Oct 03 00:06:45 crc systemd[1]: Starting Kubernetes Kubelet...
Oct 03 00:06:45 crc restorecon[4648]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by
admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 03 00:06:45 crc restorecon[4648]: 
/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 03 00:06:45 crc restorecon[4648]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 03 00:06:45 crc restorecon[4648]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 03 00:06:45 crc restorecon[4648]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c97,c980 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:45 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c377,c642 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 03 00:06:46 crc restorecon[4648]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 00:06:46 crc restorecon[4648]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c0,c25 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 03 00:06:46 crc restorecon[4648]: 
/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 03 00:06:46 crc restorecon[4648]: 
/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 00:06:46 crc restorecon[4648]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 00:06:46 crc restorecon[4648]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 03 00:06:46 crc restorecon[4648]: 
/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 
00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 00:06:46 crc 
restorecon[4648]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 03 00:06:46 crc restorecon[4648]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Oct 03 00:06:46 crc restorecon[4648]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 03 00:06:46 crc restorecon[4648]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c37,c572 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 03 00:06:46 crc restorecon[4648]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 
00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 
00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc 
restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c133,c223 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 00:06:46 crc restorecon[4648]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c682,c947 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 03 00:06:46 crc restorecon[4648]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 03 00:06:46 crc restorecon[4648]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Oct 03 00:06:47 crc kubenswrapper[4832]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 03 00:06:47 crc kubenswrapper[4832]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Oct 03 00:06:47 crc kubenswrapper[4832]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 03 00:06:47 crc kubenswrapper[4832]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
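[Editor's note, not part of the journal above] Every restorecon entry in this section has the same shape: an absolute path under /var/lib/kubelet, the phrase "not reset as customized by admin to", and an SELinux context (user:role:type:level, with MCS categories such as c7,c13). The following is a minimal stdlib-only Go sketch for summarizing such entries per context when a saved copy of this journal is fed on stdin; the program, its output format, and the file name in the usage line are illustrative assumptions, not tooling referenced by the log.

// Hypothetical helper: tallies restorecon "not reset as customized by admin to <context>"
// entries, like the ones in this journal, per SELinux context, so the repeated
// lines can be summarized instead of read one by one.
package main

import (
	"bufio"
	"fmt"
	"os"
	"regexp"
)

func main() {
	// Matches "<path> not reset as customized by admin to <selinux context>" anywhere
	// in a line; a single journal line here can carry several such entries.
	re := regexp.MustCompile(`(\S+) not reset as customized by admin to (\S+)`)

	counts := map[string]int{} // SELinux context -> number of paths left untouched
	sc := bufio.NewScanner(os.Stdin)
	sc.Buffer(make([]byte, 0, 64*1024), 1024*1024) // the journal lines here are very long
	for sc.Scan() {
		for _, m := range re.FindAllStringSubmatch(sc.Text(), -1) {
			counts[m[2]]++
		}
	}
	if err := sc.Err(); err != nil {
		fmt.Fprintln(os.Stderr, "read error:", err)
		os.Exit(1)
	}
	for ctx, n := range counts {
		fmt.Printf("%6d paths kept at %s\n", n, ctx)
	}
}

Usage would be something like "go run summarize.go < kubelet-journal.log", where the log file name is hypothetical.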
Oct 03 00:06:47 crc kubenswrapper[4832]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Oct 03 00:06:47 crc kubenswrapper[4832]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.130011 4832 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.139381 4832 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.139411 4832 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.139418 4832 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.139429 4832 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.139437 4832 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.139443 4832 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.139448 4832 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.139453 4832 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.139458 4832 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.139464 4832 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.139469 4832 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.139474 4832 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.139479 4832 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.139499 4832 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.139505 4832 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.139509 4832 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.139515 4832 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
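[Editor's note, not part of the journal above] The W lines around here show a feature-gate registry checking configured gates against the set this binary actually registers: names it does not know produce "unrecognized feature gate" warnings, while setting a deprecated or already-GA gate (KMSv1, ValidatingAdmissionPolicy) produces a "will be removed in a future release" notice. Below is a minimal sketch of that behaviour using the upstream k8s.io/component-base/featuregate package; the gate names other than KMSv1 are made up, and in the stock upstream package an unknown gate makes Set return an error, so the warn-and-continue handling is an assumption about how this particular wrapper treats it, not something the log states.

// Sketch of a feature-gate registry, assuming k8s.io/component-base/featuregate.
package main

import (
	"fmt"

	"k8s.io/component-base/featuregate"
)

func main() {
	gates := featuregate.NewFeatureGate()

	// A component registers only the gates it knows about.
	_ = gates.Add(map[featuregate.Feature]featuregate.FeatureSpec{
		"KMSv1":       {Default: false, PreRelease: featuregate.Deprecated},
		"ExampleGate": {Default: false, PreRelease: featuregate.Alpha}, // hypothetical gate
	})

	// Settings coming from cluster configuration may name gates this binary has
	// never heard of; Set returns an error for those, which a wrapper could
	// downgrade to a warning instead of exiting (assumed here).
	for _, kv := range []string{"KMSv1=true", "GatewayAPI=true"} {
		if err := gates.Set(kv); err != nil {
			fmt.Println("W feature gate not applied:", err)
			continue
		}
	}

	fmt.Println("KMSv1 enabled:", gates.Enabled("KMSv1"))
}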
Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.139520 4832 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.139525 4832 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.139530 4832 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.139534 4832 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.139539 4832 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.139543 4832 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.139547 4832 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.139551 4832 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.139558 4832 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.139565 4832 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.139570 4832 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.139588 4832 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.139594 4832 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.139599 4832 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.139604 4832 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.139608 4832 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.139611 4832 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.139615 4832 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.139619 4832 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.139623 4832 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.139627 4832 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.139631 4832 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.139635 4832 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.139638 4832 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.139642 4832 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.139646 
4832 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.139649 4832 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.139653 4832 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.139657 4832 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.139661 4832 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.139664 4832 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.139669 4832 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.139673 4832 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.139678 4832 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.139682 4832 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.139686 4832 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.139690 4832 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.139694 4832 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.139699 4832 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.139705 4832 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.139709 4832 feature_gate.go:330] unrecognized feature gate: Example Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.139714 4832 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.139719 4832 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.139723 4832 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.139727 4832 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.139731 4832 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.139735 4832 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.139739 4832 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.139774 4832 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.139779 4832 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.139791 4832 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.139797 4832 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.139801 4832 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.139806 4832 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.141815 4832 flags.go:64] FLAG: --address="0.0.0.0" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.141859 4832 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.141879 4832 flags.go:64] FLAG: --anonymous-auth="true" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.141979 4832 flags.go:64] FLAG: --application-metrics-count-limit="100" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.141992 4832 flags.go:64] FLAG: --authentication-token-webhook="false" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.142002 4832 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.142015 4832 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.142027 4832 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.142037 4832 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.142047 4832 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.142058 4832 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.142068 4832 
flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.142078 4832 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.142087 4832 flags.go:64] FLAG: --cgroup-root="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.142096 4832 flags.go:64] FLAG: --cgroups-per-qos="true" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.142105 4832 flags.go:64] FLAG: --client-ca-file="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.142114 4832 flags.go:64] FLAG: --cloud-config="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.142124 4832 flags.go:64] FLAG: --cloud-provider="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.142133 4832 flags.go:64] FLAG: --cluster-dns="[]" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.142144 4832 flags.go:64] FLAG: --cluster-domain="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.142154 4832 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.142164 4832 flags.go:64] FLAG: --config-dir="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.142175 4832 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.142187 4832 flags.go:64] FLAG: --container-log-max-files="5" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.142198 4832 flags.go:64] FLAG: --container-log-max-size="10Mi" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.142207 4832 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.142216 4832 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.142226 4832 flags.go:64] FLAG: --containerd-namespace="k8s.io" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.142236 4832 flags.go:64] FLAG: --contention-profiling="false" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.142245 4832 flags.go:64] FLAG: --cpu-cfs-quota="true" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.142254 4832 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.142264 4832 flags.go:64] FLAG: --cpu-manager-policy="none" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.142273 4832 flags.go:64] FLAG: --cpu-manager-policy-options="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.142285 4832 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.142296 4832 flags.go:64] FLAG: --enable-controller-attach-detach="true" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.142305 4832 flags.go:64] FLAG: --enable-debugging-handlers="true" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.142314 4832 flags.go:64] FLAG: --enable-load-reader="false" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.142323 4832 flags.go:64] FLAG: --enable-server="true" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.142332 4832 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.142344 4832 flags.go:64] FLAG: --event-burst="100" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.142354 4832 flags.go:64] FLAG: --event-qps="50" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.142363 4832 flags.go:64] FLAG: 
--event-storage-age-limit="default=0" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.142372 4832 flags.go:64] FLAG: --event-storage-event-limit="default=0" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.142381 4832 flags.go:64] FLAG: --eviction-hard="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.142392 4832 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.142402 4832 flags.go:64] FLAG: --eviction-minimum-reclaim="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.142412 4832 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.142421 4832 flags.go:64] FLAG: --eviction-soft="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.142473 4832 flags.go:64] FLAG: --eviction-soft-grace-period="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.142483 4832 flags.go:64] FLAG: --exit-on-lock-contention="false" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.142492 4832 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.142501 4832 flags.go:64] FLAG: --experimental-mounter-path="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.142510 4832 flags.go:64] FLAG: --fail-cgroupv1="false" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.142519 4832 flags.go:64] FLAG: --fail-swap-on="true" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.142528 4832 flags.go:64] FLAG: --feature-gates="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.142539 4832 flags.go:64] FLAG: --file-check-frequency="20s" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.142548 4832 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.142557 4832 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.142567 4832 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.142576 4832 flags.go:64] FLAG: --healthz-port="10248" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.142585 4832 flags.go:64] FLAG: --help="false" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.142594 4832 flags.go:64] FLAG: --hostname-override="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.142603 4832 flags.go:64] FLAG: --housekeeping-interval="10s" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.142613 4832 flags.go:64] FLAG: --http-check-frequency="20s" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.142622 4832 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.142633 4832 flags.go:64] FLAG: --image-credential-provider-config="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.142643 4832 flags.go:64] FLAG: --image-gc-high-threshold="85" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.142652 4832 flags.go:64] FLAG: --image-gc-low-threshold="80" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.142662 4832 flags.go:64] FLAG: --image-service-endpoint="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.142672 4832 flags.go:64] FLAG: --kernel-memcg-notification="false" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.142681 4832 flags.go:64] FLAG: --kube-api-burst="100" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.142690 4832 flags.go:64] FLAG: 
--kube-api-content-type="application/vnd.kubernetes.protobuf" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.142700 4832 flags.go:64] FLAG: --kube-api-qps="50" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.142709 4832 flags.go:64] FLAG: --kube-reserved="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.142718 4832 flags.go:64] FLAG: --kube-reserved-cgroup="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.142728 4832 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.142737 4832 flags.go:64] FLAG: --kubelet-cgroups="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.142746 4832 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.142806 4832 flags.go:64] FLAG: --lock-file="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.142823 4832 flags.go:64] FLAG: --log-cadvisor-usage="false" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.142837 4832 flags.go:64] FLAG: --log-flush-frequency="5s" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.142850 4832 flags.go:64] FLAG: --log-json-info-buffer-size="0" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.142888 4832 flags.go:64] FLAG: --log-json-split-stream="false" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.142901 4832 flags.go:64] FLAG: --log-text-info-buffer-size="0" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.142913 4832 flags.go:64] FLAG: --log-text-split-stream="false" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.142926 4832 flags.go:64] FLAG: --logging-format="text" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.142937 4832 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.142950 4832 flags.go:64] FLAG: --make-iptables-util-chains="true" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.142962 4832 flags.go:64] FLAG: --manifest-url="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.142972 4832 flags.go:64] FLAG: --manifest-url-header="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.142984 4832 flags.go:64] FLAG: --max-housekeeping-interval="15s" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.142994 4832 flags.go:64] FLAG: --max-open-files="1000000" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.143005 4832 flags.go:64] FLAG: --max-pods="110" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.143015 4832 flags.go:64] FLAG: --maximum-dead-containers="-1" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.143024 4832 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.143034 4832 flags.go:64] FLAG: --memory-manager-policy="None" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.143076 4832 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.143086 4832 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.143095 4832 flags.go:64] FLAG: --node-ip="192.168.126.11" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.143105 4832 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.143126 4832 flags.go:64] FLAG: --node-status-max-images="50" 
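
Most of the flags in this dump are stock defaults, but the sizing and log-rotation values map one-to-one onto KubeletConfiguration fields. A sketch using the numbers visible above, for orientation only (on this node they most likely come from /etc/kubernetes/kubelet.conf rather than the command line):

    # sizing fields corresponding to flags in the dump above (sketch)
    maxPods: 110
    containerLogMaxFiles: 5
    containerLogMaxSize: 10Mi
    imageGCHighThresholdPercent: 85
    imageGCLowThresholdPercent: 80
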
Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.143136 4832 flags.go:64] FLAG: --node-status-update-frequency="10s" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.143145 4832 flags.go:64] FLAG: --oom-score-adj="-999" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.143155 4832 flags.go:64] FLAG: --pod-cidr="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.143163 4832 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.143180 4832 flags.go:64] FLAG: --pod-manifest-path="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.143189 4832 flags.go:64] FLAG: --pod-max-pids="-1" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.143198 4832 flags.go:64] FLAG: --pods-per-core="0" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.143207 4832 flags.go:64] FLAG: --port="10250" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.143217 4832 flags.go:64] FLAG: --protect-kernel-defaults="false" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.143226 4832 flags.go:64] FLAG: --provider-id="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.143235 4832 flags.go:64] FLAG: --qos-reserved="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.143244 4832 flags.go:64] FLAG: --read-only-port="10255" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.143253 4832 flags.go:64] FLAG: --register-node="true" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.143262 4832 flags.go:64] FLAG: --register-schedulable="true" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.143271 4832 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.143289 4832 flags.go:64] FLAG: --registry-burst="10" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.143301 4832 flags.go:64] FLAG: --registry-qps="5" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.143313 4832 flags.go:64] FLAG: --reserved-cpus="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.143324 4832 flags.go:64] FLAG: --reserved-memory="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.143339 4832 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.143351 4832 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.143362 4832 flags.go:64] FLAG: --rotate-certificates="false" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.143373 4832 flags.go:64] FLAG: --rotate-server-certificates="false" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.143382 4832 flags.go:64] FLAG: --runonce="false" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.143392 4832 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.143402 4832 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.143414 4832 flags.go:64] FLAG: --seccomp-default="false" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.143425 4832 flags.go:64] FLAG: --serialize-image-pulls="true" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.143436 4832 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.143449 4832 flags.go:64] FLAG: 
--storage-driver-db="cadvisor" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.143462 4832 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.143473 4832 flags.go:64] FLAG: --storage-driver-password="root" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.143484 4832 flags.go:64] FLAG: --storage-driver-secure="false" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.143496 4832 flags.go:64] FLAG: --storage-driver-table="stats" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.143507 4832 flags.go:64] FLAG: --storage-driver-user="root" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.143518 4832 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.143530 4832 flags.go:64] FLAG: --sync-frequency="1m0s" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.143541 4832 flags.go:64] FLAG: --system-cgroups="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.143552 4832 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.143571 4832 flags.go:64] FLAG: --system-reserved-cgroup="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.143584 4832 flags.go:64] FLAG: --tls-cert-file="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.143595 4832 flags.go:64] FLAG: --tls-cipher-suites="[]" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.143612 4832 flags.go:64] FLAG: --tls-min-version="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.143701 4832 flags.go:64] FLAG: --tls-private-key-file="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.143716 4832 flags.go:64] FLAG: --topology-manager-policy="none" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.143727 4832 flags.go:64] FLAG: --topology-manager-policy-options="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.143739 4832 flags.go:64] FLAG: --topology-manager-scope="container" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.143784 4832 flags.go:64] FLAG: --v="2" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.143809 4832 flags.go:64] FLAG: --version="false" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.143824 4832 flags.go:64] FLAG: --vmodule="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.143837 4832 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.143850 4832 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.144112 4832 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.144129 4832 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.144138 4832 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.144146 4832 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.144155 4832 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.144164 4832 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 
00:06:47.144172 4832 feature_gate.go:330] unrecognized feature gate: Example Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.144180 4832 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.144189 4832 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.144201 4832 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.144209 4832 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.144218 4832 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.144226 4832 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.144234 4832 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.144242 4832 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.144251 4832 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.144259 4832 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.144267 4832 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.144274 4832 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.144282 4832 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.144291 4832 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.144299 4832 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.144307 4832 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.144314 4832 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.144322 4832 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.144330 4832 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.144339 4832 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.144349 4832 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.144423 4832 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.144436 4832 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.144446 4832 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.144456 4832 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.144466 4832 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.144477 4832 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.144487 4832 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.144497 4832 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.144507 4832 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.144520 4832 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.144530 4832 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.144541 4832 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.144551 4832 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.144562 4832 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.144573 4832 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.144583 4832 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.144593 4832 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.144604 4832 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.144614 4832 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.144624 4832 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.144635 4832 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.144645 4832 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.144655 4832 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.144666 4832 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.144680 4832 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.144694 4832 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.144706 4832 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.144717 4832 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.144730 4832 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.144742 4832 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.144794 4832 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.144806 4832 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.144818 4832 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.144828 4832 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.144841 4832 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.144851 4832 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.144864 4832 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.144874 4832 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.144885 4832 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.144899 4832 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.144910 4832 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.144921 4832 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.144934 4832 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.145783 4832 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.156615 4832 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.156640 4832 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.156699 4832 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.156705 4832 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.156709 4832 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.156714 4832 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.156717 4832 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.156721 4832 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.156725 4832 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.156729 4832 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.156734 4832 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
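
The "feature gates:" line above is the net result of all the preceding warnings: the kubelet keeps only the gates its own registry recognizes (CloudDualStackNodeIPs, KMSv1, ValidatingAdmissionPolicy, and the rest of the printed map), while the unrecognized names are OpenShift platform gates that the upstream kubelet does not know about and simply skips with a warning. If the recognized gates were being set by hand rather than by the rendered kubelet.conf, they would live under featureGates in the KubeletConfiguration; an illustrative sketch mirroring part of the map above, not this node's actual configuration:

    # featureGates in KubeletConfiguration (illustrative sketch)
    featureGates:
      KMSv1: true                                   # deprecated gate, still accepted with a warning
      DisableKubeletCloudCredentialProviders: true  # GA gate, accepted with a warning
      NodeSwap: false
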
Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.156738 4832 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.156742 4832 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.156747 4832 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.156834 4832 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.156839 4832 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.156843 4832 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.156846 4832 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.156850 4832 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.156854 4832 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.156858 4832 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.156862 4832 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.156866 4832 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.156870 4832 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.156874 4832 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.156878 4832 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.156882 4832 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.156886 4832 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.156890 4832 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.156894 4832 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.156899 4832 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.156902 4832 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.156907 4832 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.156913 4832 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.156917 4832 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.156920 4832 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.156926 4832 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.156931 4832 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.156935 4832 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.156938 4832 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.156942 4832 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.156946 4832 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.156949 4832 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.156953 4832 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.156956 4832 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.156959 4832 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.156963 4832 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.156966 4832 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.156970 4832 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.156973 4832 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.156977 4832 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.156980 4832 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.156983 4832 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.156987 4832 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.156991 4832 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.156995 4832 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.156998 4832 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.157002 4832 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 03 
00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.157005 4832 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.157008 4832 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.157012 4832 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.157015 4832 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.157019 4832 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.157022 4832 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.157025 4832 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.157029 4832 feature_gate.go:330] unrecognized feature gate: Example Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.157032 4832 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.157036 4832 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.157039 4832 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.157043 4832 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.157046 4832 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.157050 4832 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.157054 4832 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.157060 4832 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.157173 4832 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.157179 4832 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.157183 4832 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.157187 4832 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.157191 4832 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.157195 4832 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.157199 4832 feature_gate.go:330] unrecognized feature gate: 
MultiArchInstallGCP Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.157203 4832 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.157207 4832 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.157210 4832 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.157215 4832 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.157218 4832 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.157223 4832 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.157226 4832 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.157230 4832 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.157234 4832 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.157237 4832 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.157241 4832 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.157245 4832 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.157249 4832 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.157253 4832 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.157256 4832 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.157260 4832 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.157263 4832 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.157267 4832 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.157270 4832 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.157273 4832 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.157278 4832 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.157283 4832 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.157288 4832 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.157292 4832 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.157296 4832 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.157300 4832 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.157304 4832 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.157309 4832 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.157314 4832 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.157319 4832 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.157324 4832 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.157328 4832 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.157333 4832 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.157337 4832 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.157341 4832 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.157345 4832 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.157348 4832 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.157352 4832 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.157355 4832 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.157359 4832 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.157362 4832 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.157366 4832 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.157369 4832 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.157373 4832 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.157376 4832 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.157380 4832 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 03 00:06:47 
crc kubenswrapper[4832]: W1003 00:06:47.157383 4832 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.157387 4832 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.157390 4832 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.157393 4832 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.157397 4832 feature_gate.go:330] unrecognized feature gate: Example Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.157400 4832 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.157404 4832 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.157407 4832 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.157411 4832 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.157415 4832 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.157418 4832 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.157422 4832 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.157425 4832 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.157428 4832 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.157432 4832 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.157435 4832 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.157439 4832 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.157443 4832 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.157449 4832 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.157615 4832 server.go:940] "Client rotation is on, will bootstrap in background" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.161855 4832 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.161937 4832 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
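
"Client rotation is on" next to a flag dump that shows --rotate-certificates at its default of false is not a contradiction: the rotation setting almost certainly comes from the config file rather than the command line, and the bootstrap check right above confirms the existing kubeconfig and the client cert/key pair under /var/lib/kubelet/pki are still valid, so no new bootstrap is needed. The relevant KubeletConfiguration knobs would look roughly like this (a sketch, not the node's actual kubelet.conf):

    # certificate rotation settings in KubeletConfiguration (sketch)
    rotateCertificates: true    # rotate the client certificate as it approaches expiry
    serverTLSBootstrap: true    # also request/rotate the serving certificate via the CSR API, if desired
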
Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.163590 4832 server.go:997] "Starting client certificate rotation" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.163615 4832 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.163891 4832 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-11-21 22:58:53.582295881 +0000 UTC Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.164068 4832 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 1198h52m6.418233411s for next certificate rotation Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.188390 4832 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.191380 4832 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.211123 4832 log.go:25] "Validated CRI v1 runtime API" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.246025 4832 log.go:25] "Validated CRI v1 image API" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.248551 4832 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.254171 4832 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-10-03-00-02-37-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.254236 4832 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.288636 4832 manager.go:217] Machine: {Timestamp:2025-10-03 00:06:47.284917758 +0000 UTC m=+0.547072954 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654132736 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:22c336a9-c598-416a-a85e-c54be0a6a3c5 BootID:1e5f816a-2677-4a9b-8b73-ae876a2ba5f2 Filesystems:[{Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730829824 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827068416 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 
Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:a7:bd:3d Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:a7:bd:3d Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:82:d9:be Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:a0:a1:f8 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:6c:2a:19 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:98:fb:7c Speed:-1 Mtu:1496} {Name:eth10 MacAddress:72:39:3d:7b:10:a9 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:d2:11:0e:08:d8:05 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654132736 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: 
DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.289106 4832 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.289395 4832 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.292184 4832 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.292525 4832 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.292584 4832 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.292913 4832 topology_manager.go:138] "Creating topology manager with none policy" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.292932 4832 container_manager_linux.go:303] "Creating device plugin manager" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.293481 4832 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.293528 4832 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Oct 03 
00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.293941 4832 state_mem.go:36] "Initialized new in-memory state store" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.294079 4832 server.go:1245] "Using root directory" path="/var/lib/kubelet" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.297672 4832 kubelet.go:418] "Attempting to sync node with API server" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.297775 4832 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.297835 4832 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.297858 4832 kubelet.go:324] "Adding apiserver pod source" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.297878 4832 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.303257 4832 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.162:6443: connect: connection refused Oct 03 00:06:47 crc kubenswrapper[4832]: E1003 00:06:47.303427 4832 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.162:6443: connect: connection refused" logger="UnhandledError" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.303606 4832 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.304125 4832 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.162:6443: connect: connection refused Oct 03 00:06:47 crc kubenswrapper[4832]: E1003 00:06:47.304329 4832 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.162:6443: connect: connection refused" logger="UnhandledError" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.305646 4832 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". 
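The certificate_manager and certificate_store entries above report when the kubelet client certificate expires, the jittered rotation deadline, and how long the kubelet will wait before rotating; the last entry shows the serving pair being loaded from /var/lib/kubelet/pki/kubelet-server-current.pem. A minimal sketch to cross-check those figures on the node itself, assuming the PEM file at the logged path is readable by the caller; this is an inspection aid, not part of the kubelet.

// certcheck.go - minimal sketch: read the kubelet client certificate referenced in
// the certificate_store.go entry above and print its validity window, to cross-check
// the "Certificate expiration is ... rotation deadline is ..." messages.
package main

import (
	"crypto/x509"
	"encoding/pem"
	"fmt"
	"log"
	"os"
	"time"
)

func main() {
	// Path taken from the log; swap in kubelet-server-current.pem for the serving cert.
	const path = "/var/lib/kubelet/pki/kubelet-client-current.pem"

	raw, err := os.ReadFile(path)
	if err != nil {
		log.Fatalf("read %s: %v", path, err)
	}

	// The file holds key and certificate concatenated; walk the PEM blocks and
	// report every CERTIFICATE block found.
	for block, rest := pem.Decode(raw); block != nil; block, rest = pem.Decode(rest) {
		if block.Type != "CERTIFICATE" {
			continue
		}
		cert, err := x509.ParseCertificate(block.Bytes)
		if err != nil {
			log.Fatalf("parse certificate: %v", err)
		}
		fmt.Printf("subject:    %s\n", cert.Subject)
		fmt.Printf("not before: %s\n", cert.NotBefore.UTC())
		fmt.Printf("not after:  %s\n", cert.NotAfter.UTC())
		fmt.Printf("remaining:  %s\n", time.Until(cert.NotAfter).Round(time.Second))
	}
}

The "not after" value printed here should match the expiration the kubelet logs, while the rotation deadline it logs falls earlier because the client-go certificate manager rotates at a jittered point before expiry.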
Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.308171 4832 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.310167 4832 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.310207 4832 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.310221 4832 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.310232 4832 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.310250 4832 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.310264 4832 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.310276 4832 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.310296 4832 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.310311 4832 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.310323 4832 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.310341 4832 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.310353 4832 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.312877 4832 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.313574 4832 server.go:1280] "Started kubelet" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.316130 4832 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.317563 4832 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.316842 4832 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Oct 03 00:06:47 crc systemd[1]: Started Kubernetes Kubelet. 
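Once server.go:1280 logs "Started kubelet", the entries above show the process listening on 0.0.0.0:10250 and serving the podresources API on the Unix socket under /var/lib/kubelet/pod-resources. A minimal reachability check for both endpoints, assuming it runs on the node; it only probes that they exist and does not authenticate or speak either API.

// kubeletcheck.go - minimal sketch: confirm the two endpoints the startup entries
// above say the kubelet exposed -- the HTTPS server on port 10250 and the
// podresources Unix socket.
package main

import (
	"fmt"
	"net"
	"os"
	"time"
)

func main() {
	// TCP port from the "Starting to listen" entry above.
	if conn, err := net.DialTimeout("tcp", "127.0.0.1:10250", 2*time.Second); err != nil {
		fmt.Printf("kubelet port 10250: not reachable: %v\n", err)
	} else {
		conn.Close()
		fmt.Println("kubelet port 10250: listening")
	}

	// Unix socket path from the "Starting to serve the podresources API" entry above.
	const sock = "/var/lib/kubelet/pod-resources/kubelet.sock"
	if info, err := os.Stat(sock); err != nil {
		fmt.Printf("podresources socket: %v\n", err)
	} else if info.Mode()&os.ModeSocket != 0 {
		fmt.Println("podresources socket: present")
	} else {
		fmt.Printf("podresources socket: %s exists but is not a socket\n", sock)
	}
}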
Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.318522 4832 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.162:6443: connect: connection refused Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.322436 4832 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.322501 4832 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.322560 4832 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-13 12:40:34.970930986 +0000 UTC Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.322611 4832 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 996h33m47.648324802s for next certificate rotation Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.322674 4832 volume_manager.go:287] "The desired_state_of_world populator starts" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.322691 4832 volume_manager.go:289] "Starting Kubelet Volume Manager" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.322716 4832 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.322840 4832 server.go:460] "Adding debug handlers to kubelet server" Oct 03 00:06:47 crc kubenswrapper[4832]: E1003 00:06:47.322914 4832 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Oct 03 00:06:47 crc kubenswrapper[4832]: E1003 00:06:47.323426 4832 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.162:6443: connect: connection refused" interval="200ms" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.323607 4832 factory.go:55] Registering systemd factory Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.323624 4832 factory.go:221] Registration of the systemd container factory successfully Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.324111 4832 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.162:6443: connect: connection refused Oct 03 00:06:47 crc kubenswrapper[4832]: E1003 00:06:47.324238 4832 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.162:6443: connect: connection refused" logger="UnhandledError" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.328526 4832 factory.go:153] Registering CRI-O factory Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.329016 4832 factory.go:221] Registration of the crio container factory successfully Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.329151 4832 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: 
connect: no such file or directory Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.329182 4832 factory.go:103] Registering Raw factory Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.329208 4832 manager.go:1196] Started watching for new ooms in manager Oct 03 00:06:47 crc kubenswrapper[4832]: E1003 00:06:47.328422 4832 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.162:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.186ad275adfb3251 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-10-03 00:06:47.313527377 +0000 UTC m=+0.575682523,LastTimestamp:2025-10-03 00:06:47.313527377 +0000 UTC m=+0.575682523,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.330058 4832 manager.go:319] Starting recovery of all containers Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.334081 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.334131 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.334146 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.334196 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.334209 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.334223 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.334237 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.334251 4832 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.334270 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.334285 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.334297 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.334311 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.334323 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.334339 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.334352 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.334363 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.334374 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.334387 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.334400 4832 reconstruct.go:130] "Volume is marked as uncertain and 
added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.334412 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.334480 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.334496 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.334508 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.334523 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.334537 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.334551 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.334584 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.334600 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.334614 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.334629 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual 
state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.334641 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.334654 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.334667 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.334682 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.334695 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.334712 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.334725 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.334772 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.334786 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.334799 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.334811 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.334824 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.334837 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.334852 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.334866 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.334882 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.334894 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.334910 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.334923 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.334936 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.334949 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.334963 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" 
volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.334985 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.335033 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.335049 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.335063 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.335077 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.335090 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.335104 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.335141 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.335153 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.335168 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.335180 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.335218 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.335231 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.335243 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.335255 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.335271 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.335283 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.335296 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.335310 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.335323 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.335374 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.335392 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" 
volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.335410 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.335424 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.335437 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.335452 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.335464 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.335477 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.335489 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.335503 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.335514 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.335529 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.335542 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" 
volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.335556 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.335569 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.335584 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.335596 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.335608 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.335621 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.335634 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.335648 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.335663 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.335675 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.335687 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" 
volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.335699 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.335714 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.335726 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.335738 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.335768 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.335782 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.335795 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.335809 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.335829 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.335843 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.335857 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.335870 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.335883 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.335899 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.335911 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.335927 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.335941 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.335955 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.335971 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.335983 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.335995 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.339342 4832 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" 
volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.339407 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.339429 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.339445 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.339475 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.339489 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.339506 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.339521 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.339537 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.339550 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.339564 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.339580 4832 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.339596 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.339615 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.339630 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.339644 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.339658 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.339674 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.339688 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.339702 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.339719 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.339732 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.339746 4832 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.340836 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.340860 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.340875 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.340890 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.340906 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.340921 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.340937 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.340953 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.341001 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.341018 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.341032 4832 reconstruct.go:130] "Volume is marked as uncertain and 
added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.341047 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.341061 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.341076 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.341091 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.341109 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.341124 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.341142 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.341158 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.341174 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.341193 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.341207 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual 
state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.341234 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.341249 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.341264 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.341278 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.341292 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.341305 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.341319 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.341333 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.341347 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.341361 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.341376 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.341394 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.341407 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.341421 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.341434 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.341449 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.341462 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.341475 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.341488 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.341505 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.341518 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.341533 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.341547 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.341562 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.341577 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.341590 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.341605 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.341621 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.341634 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.341647 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.341660 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.341673 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.341687 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.341699 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.341713 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.341731 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.341745 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.341784 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.341801 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.341814 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.341828 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.341841 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.341855 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.341870 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.341886 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.341900 4832 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.341911 4832 reconstruct.go:97] "Volume reconstruction finished" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.341920 4832 reconciler.go:26] "Reconciler: start to sync state" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.349512 4832 manager.go:324] Recovery completed Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.365492 4832 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.367418 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.367510 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.367542 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.368695 4832 cpu_manager.go:225] "Starting CPU manager" policy="none" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.368715 4832 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.368740 4832 state_mem.go:36] "Initialized new in-memory state store" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.383214 4832 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.385130 4832 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.385194 4832 status_manager.go:217] "Starting to sync pod status with apiserver" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.385253 4832 kubelet.go:2335] "Starting kubelet main sync loop" Oct 03 00:06:47 crc kubenswrapper[4832]: E1003 00:06:47.385302 4832 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.386133 4832 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.162:6443: connect: connection refused Oct 03 00:06:47 crc kubenswrapper[4832]: E1003 00:06:47.386231 4832 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.162:6443: connect: connection refused" logger="UnhandledError" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.394279 4832 policy_none.go:49] "None policy: Start" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.395655 4832 memory_manager.go:170] "Starting memorymanager" policy="None" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.395848 4832 state_mem.go:35] "Initializing new in-memory state store" Oct 03 00:06:47 crc kubenswrapper[4832]: E1003 00:06:47.423601 4832 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.449173 4832 manager.go:334] "Starting Device Plugin manager" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.449966 4832 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.449995 4832 server.go:79] "Starting device plugin registration server" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.450547 4832 eviction_manager.go:189] "Eviction manager: starting control loop" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.450568 4832 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.451556 4832 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.451888 4832 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.451903 4832 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Oct 03 00:06:47 crc kubenswrapper[4832]: E1003 00:06:47.457371 4832 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.485934 4832 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"] Oct 03 00:06:47 crc kubenswrapper[4832]: 
I1003 00:06:47.486042 4832 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.487196 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.487226 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.487237 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.487359 4832 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.488194 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.488228 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.488262 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.488718 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.488731 4832 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.488808 4832 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.488960 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.489059 4832 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.489855 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.489885 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.489896 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.490052 4832 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.490300 4832 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.490381 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.490392 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.490410 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.490424 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.490385 4832 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.490425 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.490442 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.490965 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.491010 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.491022 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.491181 4832 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.491302 4832 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.491337 4832 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.491401 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.491467 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.491486 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.492061 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.492096 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.492114 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.492321 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.492376 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.492389 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.492742 4832 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.492805 4832 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.494151 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.494188 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.494204 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:06:47 crc kubenswrapper[4832]: E1003 00:06:47.525079 4832 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.162:6443: connect: connection refused" interval="400ms" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.543413 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.543459 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.543518 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.543536 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.543552 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.543610 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.543641 4832 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.543657 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.543673 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.543736 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.543852 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.543904 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.543939 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.543958 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.544099 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.551770 4832 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 
00:06:47.552947 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.552978 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.552988 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.553011 4832 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 03 00:06:47 crc kubenswrapper[4832]: E1003 00:06:47.553457 4832 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.162:6443: connect: connection refused" node="crc" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.646556 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.646611 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.646630 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.646645 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.646661 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.646677 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.646694 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.646708 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.646724 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.646738 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.646794 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.646826 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.646847 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.646862 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.646876 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.646962 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.646961 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 
00:06:47.647065 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.647076 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.647117 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.647116 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.647089 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.647156 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.647196 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.647237 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.647199 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.647225 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.647250 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.647281 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.647196 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.754086 4832 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.755224 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.755279 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.755291 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.755313 4832 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 03 00:06:47 crc kubenswrapper[4832]: E1003 00:06:47.755621 4832 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.162:6443: connect: connection refused" node="crc" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.832457 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.840772 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.863225 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.870890 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 03 00:06:47 crc kubenswrapper[4832]: I1003 00:06:47.876214 4832 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.877975 4832 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-489a850da7a144e0df6b0d884756403613b324791e133f6505fcfa65e71cc716 WatchSource:0}: Error finding container 489a850da7a144e0df6b0d884756403613b324791e133f6505fcfa65e71cc716: Status 404 returned error can't find the container with id 489a850da7a144e0df6b0d884756403613b324791e133f6505fcfa65e71cc716 Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.895387 4832 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-d4ed44e80c52b908874749bdfe213d8124589de348492a067b2040ff681af8db WatchSource:0}: Error finding container d4ed44e80c52b908874749bdfe213d8124589de348492a067b2040ff681af8db: Status 404 returned error can't find the container with id d4ed44e80c52b908874749bdfe213d8124589de348492a067b2040ff681af8db Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.899124 4832 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-f10ac256534513ec66580672540e3762fc65853834d9794a583134e2180fbc5b WatchSource:0}: Error finding container f10ac256534513ec66580672540e3762fc65853834d9794a583134e2180fbc5b: Status 404 returned error can't find the container with id f10ac256534513ec66580672540e3762fc65853834d9794a583134e2180fbc5b Oct 03 00:06:47 crc kubenswrapper[4832]: W1003 00:06:47.907385 4832 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-f815907a55fd788eddc27fd7e0205a2c6b2d879cf06f0d320c1f8148350aae7d WatchSource:0}: Error finding container f815907a55fd788eddc27fd7e0205a2c6b2d879cf06f0d320c1f8148350aae7d: Status 404 returned error can't find the container with id f815907a55fd788eddc27fd7e0205a2c6b2d879cf06f0d320c1f8148350aae7d Oct 03 00:06:47 crc kubenswrapper[4832]: E1003 00:06:47.926942 4832 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.162:6443: connect: connection refused" interval="800ms" Oct 03 00:06:48 crc kubenswrapper[4832]: I1003 00:06:48.156114 4832 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 00:06:48 crc kubenswrapper[4832]: I1003 00:06:48.159364 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:06:48 crc kubenswrapper[4832]: I1003 00:06:48.159438 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:06:48 crc kubenswrapper[4832]: I1003 00:06:48.159459 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:06:48 crc kubenswrapper[4832]: I1003 00:06:48.159507 4832 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 03 00:06:48 crc kubenswrapper[4832]: E1003 00:06:48.160336 4832 kubelet_node_status.go:99] "Unable to register node with API server" err="Post 
\"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.162:6443: connect: connection refused" node="crc" Oct 03 00:06:48 crc kubenswrapper[4832]: I1003 00:06:48.320344 4832 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.162:6443: connect: connection refused Oct 03 00:06:48 crc kubenswrapper[4832]: W1003 00:06:48.347717 4832 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.162:6443: connect: connection refused Oct 03 00:06:48 crc kubenswrapper[4832]: E1003 00:06:48.347967 4832 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.162:6443: connect: connection refused" logger="UnhandledError" Oct 03 00:06:48 crc kubenswrapper[4832]: W1003 00:06:48.385879 4832 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.162:6443: connect: connection refused Oct 03 00:06:48 crc kubenswrapper[4832]: E1003 00:06:48.386229 4832 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.162:6443: connect: connection refused" logger="UnhandledError" Oct 03 00:06:48 crc kubenswrapper[4832]: I1003 00:06:48.392479 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"f815907a55fd788eddc27fd7e0205a2c6b2d879cf06f0d320c1f8148350aae7d"} Oct 03 00:06:48 crc kubenswrapper[4832]: I1003 00:06:48.393733 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"f10ac256534513ec66580672540e3762fc65853834d9794a583134e2180fbc5b"} Oct 03 00:06:48 crc kubenswrapper[4832]: I1003 00:06:48.395343 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"d4ed44e80c52b908874749bdfe213d8124589de348492a067b2040ff681af8db"} Oct 03 00:06:48 crc kubenswrapper[4832]: I1003 00:06:48.396737 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"489a850da7a144e0df6b0d884756403613b324791e133f6505fcfa65e71cc716"} Oct 03 00:06:48 crc kubenswrapper[4832]: I1003 00:06:48.398007 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"0742a7d55eaab4032d366bf3edd11bb2893949578645f916b54366c3e5650f91"} Oct 03 00:06:48 crc kubenswrapper[4832]: E1003 00:06:48.728010 4832 
controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.162:6443: connect: connection refused" interval="1.6s" Oct 03 00:06:48 crc kubenswrapper[4832]: W1003 00:06:48.838296 4832 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.162:6443: connect: connection refused Oct 03 00:06:48 crc kubenswrapper[4832]: E1003 00:06:48.838386 4832 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.162:6443: connect: connection refused" logger="UnhandledError" Oct 03 00:06:48 crc kubenswrapper[4832]: W1003 00:06:48.860080 4832 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.162:6443: connect: connection refused Oct 03 00:06:48 crc kubenswrapper[4832]: E1003 00:06:48.860148 4832 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.162:6443: connect: connection refused" logger="UnhandledError" Oct 03 00:06:48 crc kubenswrapper[4832]: I1003 00:06:48.960891 4832 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 00:06:48 crc kubenswrapper[4832]: I1003 00:06:48.963304 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:06:48 crc kubenswrapper[4832]: I1003 00:06:48.963343 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:06:48 crc kubenswrapper[4832]: I1003 00:06:48.963354 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:06:48 crc kubenswrapper[4832]: I1003 00:06:48.963377 4832 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 03 00:06:48 crc kubenswrapper[4832]: E1003 00:06:48.964113 4832 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.162:6443: connect: connection refused" node="crc" Oct 03 00:06:49 crc kubenswrapper[4832]: I1003 00:06:49.320016 4832 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.162:6443: connect: connection refused Oct 03 00:06:49 crc kubenswrapper[4832]: I1003 00:06:49.404067 4832 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="a180babeb6469b2158474f74047864c74117a5c272be9437712bc1729563108b" exitCode=0 Oct 03 00:06:49 crc kubenswrapper[4832]: I1003 00:06:49.404186 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"a180babeb6469b2158474f74047864c74117a5c272be9437712bc1729563108b"} Oct 03 00:06:49 crc kubenswrapper[4832]: I1003 00:06:49.404301 4832 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 00:06:49 crc kubenswrapper[4832]: I1003 00:06:49.405799 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:06:49 crc kubenswrapper[4832]: I1003 00:06:49.405845 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:06:49 crc kubenswrapper[4832]: I1003 00:06:49.405857 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:06:49 crc kubenswrapper[4832]: I1003 00:06:49.406982 4832 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="990d28547daf6496b88f4cbe2716fce600053a9cbff1effd1a7bcf555de9f990" exitCode=0 Oct 03 00:06:49 crc kubenswrapper[4832]: I1003 00:06:49.407277 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"990d28547daf6496b88f4cbe2716fce600053a9cbff1effd1a7bcf555de9f990"} Oct 03 00:06:49 crc kubenswrapper[4832]: I1003 00:06:49.407311 4832 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 00:06:49 crc kubenswrapper[4832]: I1003 00:06:49.408365 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:06:49 crc kubenswrapper[4832]: I1003 00:06:49.408403 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:06:49 crc kubenswrapper[4832]: I1003 00:06:49.408413 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:06:49 crc kubenswrapper[4832]: I1003 00:06:49.411648 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"703d4dc13279deda56e8154ab7238b507e78852fd477db2bbe6e3e19aa0fbd37"} Oct 03 00:06:49 crc kubenswrapper[4832]: I1003 00:06:49.411704 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"09e57b2ff533ebe739ef6e23dcd8d6f5ef85143046db77ed3a95ffb87f771152"} Oct 03 00:06:49 crc kubenswrapper[4832]: I1003 00:06:49.411724 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"e2b0553c8f774a096bc1446325436a1a73a24f67b3512462ce0b5029092f30c7"} Oct 03 00:06:49 crc kubenswrapper[4832]: I1003 00:06:49.411735 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"0451961c80d1affd1cc6903d8c4ccba7de47c25255826d49ee312958f7a73b79"} Oct 03 00:06:49 crc kubenswrapper[4832]: I1003 00:06:49.411797 4832 kubelet_node_status.go:401] "Setting node annotation to enable 
volume controller attach/detach" Oct 03 00:06:49 crc kubenswrapper[4832]: I1003 00:06:49.413250 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:06:49 crc kubenswrapper[4832]: I1003 00:06:49.413308 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:06:49 crc kubenswrapper[4832]: I1003 00:06:49.413328 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:06:49 crc kubenswrapper[4832]: I1003 00:06:49.415797 4832 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="359b3f91c772c18a52ed1e5985275405b990187a860cf7a7f62c2f602f54c77e" exitCode=0 Oct 03 00:06:49 crc kubenswrapper[4832]: I1003 00:06:49.415905 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"359b3f91c772c18a52ed1e5985275405b990187a860cf7a7f62c2f602f54c77e"} Oct 03 00:06:49 crc kubenswrapper[4832]: I1003 00:06:49.415980 4832 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 00:06:49 crc kubenswrapper[4832]: I1003 00:06:49.417092 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:06:49 crc kubenswrapper[4832]: I1003 00:06:49.417143 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:06:49 crc kubenswrapper[4832]: I1003 00:06:49.417162 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:06:49 crc kubenswrapper[4832]: I1003 00:06:49.420439 4832 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="1efc7e9fce6c4e65f08d6ee3332f2f7acf8bc961205a8daf67e1474e47ecedd4" exitCode=0 Oct 03 00:06:49 crc kubenswrapper[4832]: I1003 00:06:49.420498 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"1efc7e9fce6c4e65f08d6ee3332f2f7acf8bc961205a8daf67e1474e47ecedd4"} Oct 03 00:06:49 crc kubenswrapper[4832]: I1003 00:06:49.420568 4832 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 00:06:49 crc kubenswrapper[4832]: I1003 00:06:49.422085 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:06:49 crc kubenswrapper[4832]: I1003 00:06:49.422116 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:06:49 crc kubenswrapper[4832]: I1003 00:06:49.422130 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:06:49 crc kubenswrapper[4832]: I1003 00:06:49.424619 4832 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 00:06:49 crc kubenswrapper[4832]: I1003 00:06:49.425957 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:06:49 crc kubenswrapper[4832]: I1003 00:06:49.425982 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 00:06:49 crc kubenswrapper[4832]: I1003 00:06:49.425994 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:06:49 crc kubenswrapper[4832]: E1003 00:06:49.792169 4832 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.162:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.186ad275adfb3251 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-10-03 00:06:47.313527377 +0000 UTC m=+0.575682523,LastTimestamp:2025-10-03 00:06:47.313527377 +0000 UTC m=+0.575682523,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Oct 03 00:06:50 crc kubenswrapper[4832]: I1003 00:06:50.319845 4832 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.162:6443: connect: connection refused Oct 03 00:06:50 crc kubenswrapper[4832]: I1003 00:06:50.328088 4832 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 03 00:06:50 crc kubenswrapper[4832]: E1003 00:06:50.328737 4832 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.162:6443: connect: connection refused" interval="3.2s" Oct 03 00:06:50 crc kubenswrapper[4832]: I1003 00:06:50.428558 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"fc6fca5c4906594dc721c77dba62a306d840e88ed949a9a579a695bdc1f1c011"} Oct 03 00:06:50 crc kubenswrapper[4832]: I1003 00:06:50.428612 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"edaa07dca4d8aba0a6439a43af7397b5d9ef4d5a398f5c03ac93f3f41fb80ed9"} Oct 03 00:06:50 crc kubenswrapper[4832]: I1003 00:06:50.428626 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"ef77e4c71ea4d3dd7aa61afb0787bf88352730ae638cfefe1f9f9f033208692a"} Oct 03 00:06:50 crc kubenswrapper[4832]: I1003 00:06:50.428638 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"da0287ba03b221e49a8840f93c53df8a865b9216a4e93f4a0fc4c62d79227004"} Oct 03 00:06:50 crc kubenswrapper[4832]: I1003 00:06:50.437183 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"2ed239b8b27adf25663f858ed68cf003cc0a2890127fe20c2b0ee7ce9fa1ab2a"} Oct 03 00:06:50 crc kubenswrapper[4832]: I1003 
00:06:50.437260 4832 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 00:06:50 crc kubenswrapper[4832]: I1003 00:06:50.438433 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:06:50 crc kubenswrapper[4832]: I1003 00:06:50.438470 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:06:50 crc kubenswrapper[4832]: I1003 00:06:50.438480 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:06:50 crc kubenswrapper[4832]: I1003 00:06:50.440269 4832 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="b12fcbe54af7073178320f8a130e78196b2d1d3a055c4d8410e4cc1da2d4686e" exitCode=0 Oct 03 00:06:50 crc kubenswrapper[4832]: I1003 00:06:50.440331 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"b12fcbe54af7073178320f8a130e78196b2d1d3a055c4d8410e4cc1da2d4686e"} Oct 03 00:06:50 crc kubenswrapper[4832]: I1003 00:06:50.440465 4832 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 00:06:50 crc kubenswrapper[4832]: I1003 00:06:50.441264 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:06:50 crc kubenswrapper[4832]: I1003 00:06:50.441281 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:06:50 crc kubenswrapper[4832]: I1003 00:06:50.441290 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:06:50 crc kubenswrapper[4832]: I1003 00:06:50.443482 4832 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 00:06:50 crc kubenswrapper[4832]: I1003 00:06:50.443868 4832 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 00:06:50 crc kubenswrapper[4832]: I1003 00:06:50.444188 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"c798af55ddd694a06f5faa8f350941ad89970fa629efb67ad4f0ed08608192c8"} Oct 03 00:06:50 crc kubenswrapper[4832]: I1003 00:06:50.444212 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"7a5815e928eca00e1782c57d1b18e6fb2f7db287f404ea3c83323efe2db13752"} Oct 03 00:06:50 crc kubenswrapper[4832]: I1003 00:06:50.444224 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"e2179d1da37c6a7dc24550fceca8e5f629ced4a0bfc34d8e0439be45b1a1128b"} Oct 03 00:06:50 crc kubenswrapper[4832]: I1003 00:06:50.445160 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:06:50 crc kubenswrapper[4832]: I1003 00:06:50.445179 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:06:50 crc kubenswrapper[4832]: 
I1003 00:06:50.445187 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:06:50 crc kubenswrapper[4832]: I1003 00:06:50.445707 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:06:50 crc kubenswrapper[4832]: I1003 00:06:50.445727 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:06:50 crc kubenswrapper[4832]: I1003 00:06:50.445761 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:06:50 crc kubenswrapper[4832]: I1003 00:06:50.571843 4832 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 00:06:50 crc kubenswrapper[4832]: I1003 00:06:50.573728 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:06:50 crc kubenswrapper[4832]: I1003 00:06:50.573782 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:06:50 crc kubenswrapper[4832]: I1003 00:06:50.573795 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:06:50 crc kubenswrapper[4832]: I1003 00:06:50.573815 4832 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 03 00:06:50 crc kubenswrapper[4832]: E1003 00:06:50.574410 4832 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.162:6443: connect: connection refused" node="crc" Oct 03 00:06:51 crc kubenswrapper[4832]: I1003 00:06:51.447929 4832 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="f5e7853d0c8e96873c687c75d7226d8c11ed0ab5f4f2e1e975423de06b7c8c10" exitCode=0 Oct 03 00:06:51 crc kubenswrapper[4832]: I1003 00:06:51.447993 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"f5e7853d0c8e96873c687c75d7226d8c11ed0ab5f4f2e1e975423de06b7c8c10"} Oct 03 00:06:51 crc kubenswrapper[4832]: I1003 00:06:51.448015 4832 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 00:06:51 crc kubenswrapper[4832]: I1003 00:06:51.449950 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:06:51 crc kubenswrapper[4832]: I1003 00:06:51.450008 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:06:51 crc kubenswrapper[4832]: I1003 00:06:51.450027 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:06:51 crc kubenswrapper[4832]: I1003 00:06:51.452680 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"af8925d64d9d95117625cfca58e2f0c90128ce63c82370e3a505d48fce14e474"} Oct 03 00:06:51 crc kubenswrapper[4832]: I1003 00:06:51.452727 4832 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 00:06:51 crc kubenswrapper[4832]: I1003 00:06:51.452802 4832 kubelet_node_status.go:401] "Setting node 
annotation to enable volume controller attach/detach" Oct 03 00:06:51 crc kubenswrapper[4832]: I1003 00:06:51.452891 4832 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 00:06:51 crc kubenswrapper[4832]: I1003 00:06:51.453267 4832 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 03 00:06:51 crc kubenswrapper[4832]: I1003 00:06:51.453301 4832 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 00:06:51 crc kubenswrapper[4832]: I1003 00:06:51.454238 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:06:51 crc kubenswrapper[4832]: I1003 00:06:51.454265 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:06:51 crc kubenswrapper[4832]: I1003 00:06:51.454277 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:06:51 crc kubenswrapper[4832]: I1003 00:06:51.454239 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:06:51 crc kubenswrapper[4832]: I1003 00:06:51.454339 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:06:51 crc kubenswrapper[4832]: I1003 00:06:51.454349 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:06:51 crc kubenswrapper[4832]: I1003 00:06:51.454366 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:06:51 crc kubenswrapper[4832]: I1003 00:06:51.454386 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:06:51 crc kubenswrapper[4832]: I1003 00:06:51.454411 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:06:51 crc kubenswrapper[4832]: I1003 00:06:51.454293 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:06:51 crc kubenswrapper[4832]: I1003 00:06:51.454486 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:06:51 crc kubenswrapper[4832]: I1003 00:06:51.454505 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:06:52 crc kubenswrapper[4832]: I1003 00:06:52.364615 4832 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 00:06:52 crc kubenswrapper[4832]: I1003 00:06:52.465271 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"5eb1fa15d84200378239e8a064cb83aa401750e92073f9d29d94858e65add776"} Oct 03 00:06:52 crc kubenswrapper[4832]: I1003 00:06:52.465339 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"70f4161badca9a8087ab95b77ede5bc72d0c44b3f7f0223f2073354e76d6070d"} Oct 03 00:06:52 crc kubenswrapper[4832]: I1003 00:06:52.465365 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"e1688f7425bdf5630147752e6d2c0dc58351b86d087c39bc6c4d73ffbf11949b"} Oct 03 00:06:52 crc kubenswrapper[4832]: I1003 00:06:52.465385 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"f39a6c9881b2f1d620a85a3c6887d82b81d18ecf599c66b967d5580689537314"} Oct 03 00:06:52 crc kubenswrapper[4832]: I1003 00:06:52.465389 4832 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 03 00:06:52 crc kubenswrapper[4832]: I1003 00:06:52.465476 4832 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 00:06:52 crc kubenswrapper[4832]: I1003 00:06:52.466856 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:06:52 crc kubenswrapper[4832]: I1003 00:06:52.466908 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:06:52 crc kubenswrapper[4832]: I1003 00:06:52.466921 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:06:53 crc kubenswrapper[4832]: I1003 00:06:53.105162 4832 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 03 00:06:53 crc kubenswrapper[4832]: I1003 00:06:53.106197 4832 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 00:06:53 crc kubenswrapper[4832]: I1003 00:06:53.109029 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:06:53 crc kubenswrapper[4832]: I1003 00:06:53.109084 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:06:53 crc kubenswrapper[4832]: I1003 00:06:53.109099 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:06:53 crc kubenswrapper[4832]: I1003 00:06:53.328953 4832 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 03 00:06:53 crc kubenswrapper[4832]: I1003 00:06:53.329037 4832 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 03 00:06:53 crc kubenswrapper[4832]: I1003 00:06:53.473196 4832 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 03 00:06:53 crc kubenswrapper[4832]: I1003 00:06:53.473258 4832 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 00:06:53 crc kubenswrapper[4832]: I1003 00:06:53.473916 4832 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 00:06:53 crc kubenswrapper[4832]: I1003 00:06:53.474048 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"342b151ec083dc566428af857e4013b70d34fbf70632147d13f5e251ef3460c8"} Oct 03 00:06:53 crc kubenswrapper[4832]: I1003 00:06:53.474457 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:06:53 crc kubenswrapper[4832]: I1003 00:06:53.474559 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:06:53 crc kubenswrapper[4832]: I1003 00:06:53.474650 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:06:53 crc kubenswrapper[4832]: I1003 00:06:53.474743 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:06:53 crc kubenswrapper[4832]: I1003 00:06:53.474800 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:06:53 crc kubenswrapper[4832]: I1003 00:06:53.474813 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:06:53 crc kubenswrapper[4832]: I1003 00:06:53.775386 4832 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 00:06:53 crc kubenswrapper[4832]: I1003 00:06:53.777829 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:06:53 crc kubenswrapper[4832]: I1003 00:06:53.777867 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:06:53 crc kubenswrapper[4832]: I1003 00:06:53.777882 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:06:53 crc kubenswrapper[4832]: I1003 00:06:53.777911 4832 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 03 00:06:54 crc kubenswrapper[4832]: I1003 00:06:54.440059 4832 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 00:06:54 crc kubenswrapper[4832]: I1003 00:06:54.476176 4832 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 03 00:06:54 crc kubenswrapper[4832]: I1003 00:06:54.476237 4832 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 00:06:54 crc kubenswrapper[4832]: I1003 00:06:54.476268 4832 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 00:06:54 crc kubenswrapper[4832]: I1003 00:06:54.477976 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:06:54 crc kubenswrapper[4832]: I1003 00:06:54.477978 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:06:54 crc kubenswrapper[4832]: I1003 00:06:54.478033 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:06:54 crc kubenswrapper[4832]: I1003 00:06:54.478107 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:06:54 crc kubenswrapper[4832]: I1003 00:06:54.478066 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 
00:06:54 crc kubenswrapper[4832]: I1003 00:06:54.478179 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:06:54 crc kubenswrapper[4832]: I1003 00:06:54.942542 4832 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 00:06:55 crc kubenswrapper[4832]: I1003 00:06:55.241705 4832 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 03 00:06:55 crc kubenswrapper[4832]: I1003 00:06:55.241904 4832 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 00:06:55 crc kubenswrapper[4832]: I1003 00:06:55.243477 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:06:55 crc kubenswrapper[4832]: I1003 00:06:55.243597 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:06:55 crc kubenswrapper[4832]: I1003 00:06:55.243662 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:06:55 crc kubenswrapper[4832]: I1003 00:06:55.404158 4832 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Oct 03 00:06:55 crc kubenswrapper[4832]: I1003 00:06:55.479859 4832 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 00:06:55 crc kubenswrapper[4832]: I1003 00:06:55.479876 4832 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 00:06:55 crc kubenswrapper[4832]: I1003 00:06:55.481616 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:06:55 crc kubenswrapper[4832]: I1003 00:06:55.481675 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:06:55 crc kubenswrapper[4832]: I1003 00:06:55.481694 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:06:55 crc kubenswrapper[4832]: I1003 00:06:55.481855 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:06:55 crc kubenswrapper[4832]: I1003 00:06:55.481945 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:06:55 crc kubenswrapper[4832]: I1003 00:06:55.482029 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:06:55 crc kubenswrapper[4832]: I1003 00:06:55.552485 4832 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 03 00:06:55 crc kubenswrapper[4832]: I1003 00:06:55.553216 4832 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 00:06:55 crc kubenswrapper[4832]: I1003 00:06:55.554264 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:06:55 crc kubenswrapper[4832]: I1003 00:06:55.554384 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:06:55 crc kubenswrapper[4832]: I1003 00:06:55.554457 4832 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:06:55 crc kubenswrapper[4832]: I1003 00:06:55.559121 4832 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 03 00:06:56 crc kubenswrapper[4832]: I1003 00:06:56.481824 4832 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 00:06:56 crc kubenswrapper[4832]: I1003 00:06:56.482932 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:06:56 crc kubenswrapper[4832]: I1003 00:06:56.482998 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:06:56 crc kubenswrapper[4832]: I1003 00:06:56.483021 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:06:57 crc kubenswrapper[4832]: E1003 00:06:57.457500 4832 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Oct 03 00:06:58 crc kubenswrapper[4832]: I1003 00:06:58.131875 4832 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 03 00:06:58 crc kubenswrapper[4832]: I1003 00:06:58.132070 4832 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 00:06:58 crc kubenswrapper[4832]: I1003 00:06:58.134744 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:06:58 crc kubenswrapper[4832]: I1003 00:06:58.134844 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:06:58 crc kubenswrapper[4832]: I1003 00:06:58.134857 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:06:58 crc kubenswrapper[4832]: I1003 00:06:58.139390 4832 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 03 00:06:58 crc kubenswrapper[4832]: I1003 00:06:58.485859 4832 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 00:06:58 crc kubenswrapper[4832]: I1003 00:06:58.486935 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:06:58 crc kubenswrapper[4832]: I1003 00:06:58.486994 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:06:58 crc kubenswrapper[4832]: I1003 00:06:58.487012 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:00 crc kubenswrapper[4832]: I1003 00:07:00.360589 4832 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Oct 03 00:07:00 crc kubenswrapper[4832]: I1003 00:07:00.360810 4832 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 00:07:00 crc kubenswrapper[4832]: I1003 00:07:00.361843 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:00 crc kubenswrapper[4832]: I1003 00:07:00.361883 4832 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:00 crc kubenswrapper[4832]: I1003 00:07:00.361897 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:00 crc kubenswrapper[4832]: W1003 00:07:00.961192 4832 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout Oct 03 00:07:00 crc kubenswrapper[4832]: I1003 00:07:00.961273 4832 trace.go:236] Trace[1343897030]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (03-Oct-2025 00:06:50.959) (total time: 10001ms): Oct 03 00:07:00 crc kubenswrapper[4832]: Trace[1343897030]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (00:07:00.961) Oct 03 00:07:00 crc kubenswrapper[4832]: Trace[1343897030]: [10.001694921s] [10.001694921s] END Oct 03 00:07:00 crc kubenswrapper[4832]: E1003 00:07:00.961297 4832 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Oct 03 00:07:00 crc kubenswrapper[4832]: I1003 00:07:00.994955 4832 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Oct 03 00:07:00 crc kubenswrapper[4832]: I1003 00:07:00.995011 4832 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Oct 03 00:07:01 crc kubenswrapper[4832]: I1003 00:07:01.001716 4832 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Oct 03 00:07:01 crc kubenswrapper[4832]: I1003 00:07:01.001961 4832 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Oct 03 00:07:02 crc kubenswrapper[4832]: I1003 00:07:02.370634 4832 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 00:07:02 crc kubenswrapper[4832]: I1003 00:07:02.371383 4832 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 00:07:02 crc kubenswrapper[4832]: I1003 00:07:02.371781 4832 patch_prober.go:28] interesting pod/kube-apiserver-crc 
container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Oct 03 00:07:02 crc kubenswrapper[4832]: I1003 00:07:02.371848 4832 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Oct 03 00:07:02 crc kubenswrapper[4832]: I1003 00:07:02.372570 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:02 crc kubenswrapper[4832]: I1003 00:07:02.372594 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:02 crc kubenswrapper[4832]: I1003 00:07:02.372604 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:02 crc kubenswrapper[4832]: I1003 00:07:02.375343 4832 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 00:07:02 crc kubenswrapper[4832]: I1003 00:07:02.496188 4832 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 00:07:02 crc kubenswrapper[4832]: I1003 00:07:02.496695 4832 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Oct 03 00:07:02 crc kubenswrapper[4832]: I1003 00:07:02.496772 4832 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Oct 03 00:07:02 crc kubenswrapper[4832]: I1003 00:07:02.497280 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:02 crc kubenswrapper[4832]: I1003 00:07:02.497409 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:02 crc kubenswrapper[4832]: I1003 00:07:02.497499 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:03 crc kubenswrapper[4832]: I1003 00:07:03.328862 4832 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 03 00:07:03 crc kubenswrapper[4832]: I1003 00:07:03.328936 4832 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request 
canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Oct 03 00:07:04 crc kubenswrapper[4832]: I1003 00:07:04.858498 4832 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Oct 03 00:07:04 crc kubenswrapper[4832]: I1003 00:07:04.943828 4832 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Oct 03 00:07:04 crc kubenswrapper[4832]: I1003 00:07:04.943896 4832 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Oct 03 00:07:05 crc kubenswrapper[4832]: I1003 00:07:05.309121 4832 apiserver.go:52] "Watching apiserver" Oct 03 00:07:05 crc kubenswrapper[4832]: I1003 00:07:05.315297 4832 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Oct 03 00:07:05 crc kubenswrapper[4832]: I1003 00:07:05.315615 4832 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h"] Oct 03 00:07:05 crc kubenswrapper[4832]: I1003 00:07:05.316399 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 00:07:05 crc kubenswrapper[4832]: I1003 00:07:05.316680 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 00:07:05 crc kubenswrapper[4832]: E1003 00:07:05.316781 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 00:07:05 crc kubenswrapper[4832]: E1003 00:07:05.316840 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 00:07:05 crc kubenswrapper[4832]: I1003 00:07:05.316543 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 03 00:07:05 crc kubenswrapper[4832]: I1003 00:07:05.317024 4832 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 03 00:07:05 crc kubenswrapper[4832]: I1003 00:07:05.317107 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 03 00:07:05 crc kubenswrapper[4832]: I1003 00:07:05.317668 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 00:07:05 crc kubenswrapper[4832]: E1003 00:07:05.317836 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 00:07:05 crc kubenswrapper[4832]: I1003 00:07:05.320556 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Oct 03 00:07:05 crc kubenswrapper[4832]: I1003 00:07:05.321331 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Oct 03 00:07:05 crc kubenswrapper[4832]: I1003 00:07:05.321517 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Oct 03 00:07:05 crc kubenswrapper[4832]: I1003 00:07:05.323171 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Oct 03 00:07:05 crc kubenswrapper[4832]: I1003 00:07:05.323216 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Oct 03 00:07:05 crc kubenswrapper[4832]: I1003 00:07:05.323367 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Oct 03 00:07:05 crc kubenswrapper[4832]: I1003 00:07:05.323600 4832 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Oct 03 00:07:05 crc kubenswrapper[4832]: I1003 00:07:05.323884 4832 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Oct 03 00:07:05 crc kubenswrapper[4832]: I1003 00:07:05.324644 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Oct 03 00:07:05 crc kubenswrapper[4832]: I1003 00:07:05.326623 4832 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Oct 03 00:07:05 crc kubenswrapper[4832]: I1003 00:07:05.352144 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers 
with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 00:07:05 crc kubenswrapper[4832]: I1003 00:07:05.369566 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 00:07:05 crc kubenswrapper[4832]: I1003 00:07:05.384174 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 00:07:05 crc kubenswrapper[4832]: I1003 00:07:05.399666 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 00:07:05 crc kubenswrapper[4832]: I1003 00:07:05.414641 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 00:07:05 crc kubenswrapper[4832]: I1003 00:07:05.430609 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 00:07:05 crc kubenswrapper[4832]: I1003 00:07:05.445627 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 00:07:05 crc kubenswrapper[4832]: I1003 00:07:05.457310 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 00:07:05 crc kubenswrapper[4832]: E1003 00:07:05.994174 4832 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s" Oct 03 00:07:05 crc kubenswrapper[4832]: I1003 00:07:05.996345 4832 trace.go:236] Trace[534548075]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (03-Oct-2025 00:06:51.694) (total time: 14301ms): Oct 03 00:07:05 crc kubenswrapper[4832]: Trace[534548075]: ---"Objects listed" error: 14301ms (00:07:05.996) Oct 03 00:07:05 crc kubenswrapper[4832]: Trace[534548075]: [14.301740529s] [14.301740529s] END Oct 03 00:07:05 crc kubenswrapper[4832]: I1003 00:07:05.996391 4832 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Oct 03 00:07:05 crc kubenswrapper[4832]: I1003 00:07:05.996906 4832 trace.go:236] Trace[764558341]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (03-Oct-2025 00:06:51.115) (total time: 14881ms): Oct 03 00:07:05 crc kubenswrapper[4832]: Trace[764558341]: ---"Objects listed" error: 14881ms (00:07:05.996) Oct 03 00:07:05 crc kubenswrapper[4832]: Trace[764558341]: [14.881768428s] [14.881768428s] END Oct 03 00:07:05 crc kubenswrapper[4832]: I1003 00:07:05.997085 4832 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Oct 03 00:07:05 crc kubenswrapper[4832]: E1003 00:07:05.998747 4832 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Oct 03 00:07:05 crc kubenswrapper[4832]: I1003 00:07:05.998925 4832 trace.go:236] Trace[338350276]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (03-Oct-2025 00:06:51.387) (total time: 14611ms): Oct 03 00:07:05 crc kubenswrapper[4832]: Trace[338350276]: ---"Objects listed" error: 14611ms (00:07:05.998) Oct 03 00:07:05 crc kubenswrapper[4832]: Trace[338350276]: [14.611401014s] [14.611401014s] END Oct 03 00:07:05 crc kubenswrapper[4832]: I1003 00:07:05.998972 4832 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.000412 4832 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.101405 4832 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.101491 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.101540 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.101583 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.101630 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.101669 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.101700 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.101733 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.101840 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.101874 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.101987 4832 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.102060 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.102092 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.102124 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.102156 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.102213 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.102265 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.102298 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.102331 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.102363 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.102418 4832 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.102452 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.102483 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.102538 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.102570 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.102603 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.102636 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.102673 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.102706 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.102739 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 03 00:07:06 crc 
kubenswrapper[4832]: I1003 00:07:06.102811 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.102844 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.102877 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.102909 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.102945 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.102977 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.103009 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.103039 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.103083 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.103123 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 03 00:07:06 crc 
kubenswrapper[4832]: I1003 00:07:06.103155 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.103189 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.103260 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.103294 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.103359 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.103391 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.103402 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.103538 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.103535 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.103602 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.103782 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.103832 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.103806 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.103865 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.103902 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.103902 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.103936 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.103977 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.104012 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.104011 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.104066 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.104097 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.104149 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.104181 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.104218 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.104258 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.104303 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.104348 4832 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.104384 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.104417 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.104449 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.104482 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.104519 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.104555 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.104588 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.104624 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.104657 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.104691 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.104725 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.104790 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.104831 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.104868 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.104899 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.105137 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.105188 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.105220 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.105255 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: 
\"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.105288 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.105321 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.105361 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.105394 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.105427 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.105462 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.105507 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.105544 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.105594 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.105632 4832 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.105675 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.105711 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.105774 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.105851 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.105887 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.105922 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.105961 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.105994 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.106029 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 
00:07:06.106069 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.106110 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.106154 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.106191 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.106235 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.106276 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.106311 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.106350 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.106383 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.106419 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.106458 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.106504 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.106546 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.106592 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.106633 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.106669 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.106709 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.106742 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.106807 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.106844 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.106883 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.106925 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.106959 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.107000 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.107037 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.107076 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.107117 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.107157 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.107193 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.107236 4832 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.107270 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.107311 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.107345 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.107386 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.107422 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.107457 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.107494 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.108702 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.108806 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.108867 4832 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.108901 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.108941 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.108973 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.109001 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.109029 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.109066 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.109108 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.109137 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.109168 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.109196 4832 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.109293 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.109375 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.109400 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.109424 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.109450 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.109476 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.109500 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.109523 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.109546 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: 
\"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.109570 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.109592 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.109615 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.109640 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.109663 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.109694 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.109718 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.109792 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.109822 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.109848 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " 
Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.109874 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.109904 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.109929 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.109956 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.109980 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.110010 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.110036 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.110059 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.110082 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.110109 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 
03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.110132 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.110156 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.110181 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.110204 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.110228 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.110254 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.110280 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.110303 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.110329 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.110353 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: 
\"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.110378 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.110411 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.110435 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.110461 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.110485 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.110508 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.110531 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.110556 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.110604 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.110665 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.110693 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.110719 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.110782 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.110815 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.110845 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.110870 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.110901 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.110925 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: 
\"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.110951 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.110978 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.111001 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.111045 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.111108 4832 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.111131 4832 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.111146 4832 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.111162 4832 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.111177 4832 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.111191 4832 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.111205 4832 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.111219 4832 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.104215 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.104348 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.104467 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.104548 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.104575 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.104735 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.104797 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.104865 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.104956 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.103703 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.105512 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.105608 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.105668 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.105681 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.105775 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). 
InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.105954 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.105970 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.106002 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.106023 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.108638 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.108809 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.108838 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.109047 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.109148 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.113742 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.109909 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.110223 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.110292 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.110517 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.110719 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.111080 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.111158 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.111242 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.111567 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.112283 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.112286 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.112337 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.112666 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.112962 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.113106 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.113468 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.114073 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.114245 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.114447 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.114666 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.114678 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.116062 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.116286 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.116404 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.116691 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.116711 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.117328 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.117361 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.117535 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.117563 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.117633 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.117922 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.118152 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.118106 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.118500 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.118634 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.118994 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.119148 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.119436 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.119688 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.120054 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.120846 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.120984 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). 
InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.121440 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.121500 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.121846 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.122096 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.122305 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.122473 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.123072 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.123339 4832 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.123518 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.123655 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: E1003 00:07:06.123700 4832 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 03 00:07:06 crc kubenswrapper[4832]: E1003 00:07:06.124025 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 00:07:06.623997711 +0000 UTC m=+19.886152847 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 03 00:07:06 crc kubenswrapper[4832]: E1003 00:07:06.124137 4832 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 00:07:06 crc kubenswrapper[4832]: E1003 00:07:06.124190 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 00:07:06.624176147 +0000 UTC m=+19.886331293 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.124501 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.124803 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.124708 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.125240 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.125272 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.125373 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.125591 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.125603 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.126037 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.126267 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.126349 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.126489 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.126815 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.126835 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.126993 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.127203 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.127385 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.127405 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.127462 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.127730 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: E1003 00:07:06.127836 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 00:07:06.627807246 +0000 UTC m=+19.889962372 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.127962 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.128016 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.128206 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.128272 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.128569 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.129035 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.129045 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.129156 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.129395 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.129647 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.129931 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.130139 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.130392 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.130863 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.130886 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.131521 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.135273 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.135598 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.135843 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.136207 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.142162 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.144053 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.144080 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.144122 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.144143 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: E1003 00:07:06.144420 4832 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 00:07:06 crc kubenswrapper[4832]: E1003 00:07:06.144452 4832 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 00:07:06 crc kubenswrapper[4832]: E1003 00:07:06.144477 4832 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 00:07:06 crc kubenswrapper[4832]: E1003 00:07:06.144546 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-03 00:07:06.644522841 +0000 UTC m=+19.906677957 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.144908 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.145103 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.145526 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.145743 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.146127 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.146134 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.146772 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.146784 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.147132 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.147186 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.147381 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.147858 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.147963 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.150304 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.150511 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.150618 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.150793 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.150836 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.151106 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.151493 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.151579 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.151617 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.152029 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.152058 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.152138 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.152317 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.152362 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). 
InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.152429 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.152513 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.152975 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.154174 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.154997 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.155155 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: E1003 00:07:06.155356 4832 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 00:07:06 crc kubenswrapper[4832]: E1003 00:07:06.156267 4832 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 00:07:06 crc kubenswrapper[4832]: E1003 00:07:06.156290 4832 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 00:07:06 crc kubenswrapper[4832]: E1003 00:07:06.156355 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-03 00:07:06.656332587 +0000 UTC m=+19.918487763 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.156535 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.157103 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.157206 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.157336 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.157423 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.157555 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.158280 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.158632 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.158696 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.158701 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.158877 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.159905 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). 
InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.161169 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.161386 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.161847 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.165254 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.167963 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.170923 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.171027 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.179224 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.182887 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.183469 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.183617 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.183688 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.183847 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.184070 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.185032 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.185682 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.185736 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.185872 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.189651 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.190176 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.190297 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.190508 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.195418 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.201449 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.203988 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.206142 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.212385 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.212424 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.212496 4832 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.212510 4832 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.212519 4832 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.212528 4832 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.212536 4832 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.212544 4832 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.212552 4832 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.212560 4832 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.212568 4832 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.212579 4832 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.212587 4832 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.212904 4832 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.212914 4832 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.212922 4832 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.212929 4832 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.212938 4832 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.212945 4832 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.212953 4832 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.212968 4832 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.212977 4832 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.212985 4832 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.212995 4832 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.213003 4832 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.213011 4832 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.213018 4832 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.213034 4832 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.213042 4832 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.213050 4832 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.213059 4832 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.213068 4832 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.213075 4832 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.213084 4832 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.213093 4832 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.213101 4832 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.213109 4832 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.213117 4832 reconciler_common.go:293] "Volume detached for volume 
\"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.213124 4832 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.213132 4832 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.213140 4832 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.213148 4832 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.213157 4832 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.213167 4832 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.213175 4832 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.213184 4832 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.213193 4832 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.213202 4832 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.213210 4832 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.213219 4832 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.213227 4832 reconciler_common.go:293] 
"Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.213235 4832 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.213242 4832 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.213251 4832 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.213260 4832 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.213268 4832 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.213275 4832 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.213284 4832 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.213291 4832 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.213299 4832 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.213307 4832 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.213316 4832 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.213324 4832 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.213331 4832 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.213339 4832 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.213347 4832 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.213338 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.213354 4832 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.213396 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.213398 4832 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.213418 4832 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.213431 4832 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.213444 4832 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.213456 4832 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.213467 4832 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.213479 4832 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: 
\"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.213490 4832 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.213501 4832 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.213512 4832 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.213525 4832 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.213556 4832 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.213570 4832 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.213581 4832 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.213593 4832 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.213604 4832 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.213617 4832 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.213628 4832 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.213640 4832 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.213652 4832 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" 
DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.213662 4832 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.213674 4832 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.213690 4832 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.213701 4832 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.213712 4832 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.213723 4832 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.213735 4832 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.213746 4832 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.213774 4832 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.213785 4832 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.213796 4832 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.213807 4832 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.213831 4832 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc 
kubenswrapper[4832]: I1003 00:07:06.213844 4832 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.213855 4832 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.213867 4832 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.213879 4832 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.213889 4832 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.213900 4832 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.213910 4832 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.213922 4832 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.213934 4832 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.213947 4832 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.213957 4832 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.213968 4832 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.213984 4832 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 
00:07:06.213995 4832 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.214006 4832 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.214017 4832 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.214028 4832 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.214039 4832 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.214049 4832 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.214059 4832 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.214069 4832 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.214084 4832 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.214094 4832 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.214104 4832 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.214114 4832 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.214124 4832 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.214134 4832 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.214145 4832 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.214155 4832 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.214168 4832 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.214179 4832 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.214195 4832 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.214206 4832 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.214218 4832 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.214229 4832 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.214240 4832 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.214251 4832 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.214262 4832 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.214276 4832 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.214287 4832 reconciler_common.go:293] "Volume detached for volume 
\"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.214299 4832 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.214310 4832 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.214321 4832 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.214332 4832 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.214344 4832 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.214356 4832 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.214369 4832 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.214379 4832 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.214390 4832 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.214401 4832 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.214411 4832 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.214424 4832 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.214436 4832 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.214453 4832 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.214466 4832 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.214478 4832 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.214490 4832 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.214500 4832 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.214510 4832 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.214521 4832 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.214533 4832 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.214544 4832 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.214556 4832 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.214569 4832 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.214579 4832 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.214590 4832 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.214600 4832 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.214611 4832 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.214622 4832 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.214633 4832 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.214644 4832 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.214652 4832 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.214660 4832 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.214668 4832 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.214676 4832 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.214684 4832 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.214692 4832 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.214700 4832 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.214708 4832 reconciler_common.go:293] "Volume detached 
for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.214715 4832 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.214723 4832 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.214731 4832 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.214739 4832 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.214747 4832 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.214772 4832 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.214789 4832 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.214801 4832 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.217719 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.234852 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.246410 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.252880 4832 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 03 00:07:06 crc kubenswrapper[4832]: W1003 00:07:06.262903 4832 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-37efc33186f2513de18b867ce26339f8c7f0955470bd2ad25577ae5330ce96bf WatchSource:0}: Error finding container 37efc33186f2513de18b867ce26339f8c7f0955470bd2ad25577ae5330ce96bf: Status 404 returned error can't find the container with id 37efc33186f2513de18b867ce26339f8c7f0955470bd2ad25577ae5330ce96bf Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.315985 4832 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.507416 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"6753f7eb0f3ff7e4c94125513be77026242079d658b7139e86701a2b04ba5d1c"} Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.507470 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"823db4c8596fa85759a3d818b0282b8b8b2c4c2e948160accbd306ef09e38060"} Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.507486 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"1ebd67f0bcec46fb57612c581c3f26a87e7f1648a1cb45e0534231d19aca5a7c"} Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.508801 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"37efc33186f2513de18b867ce26339f8c7f0955470bd2ad25577ae5330ce96bf"} Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.510587 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"4175afbbdf7905f2c69f608e1cd85fdfae2818582d88e9a6d46c20ba0cede788"} Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.510612 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"8b496b9f74828ac2ab53dcfadc87351d7aaaba9263aa2171d92792ef7da01a93"} Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.512221 4832 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.514027 4832 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="af8925d64d9d95117625cfca58e2f0c90128ce63c82370e3a505d48fce14e474" exitCode=255 Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.514054 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"af8925d64d9d95117625cfca58e2f0c90128ce63c82370e3a505d48fce14e474"} Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.522523 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.535318 4832 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.535731 4832 scope.go:117] "RemoveContainer" containerID="af8925d64d9d95117625cfca58e2f0c90128ce63c82370e3a505d48fce14e474" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.536186 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.553338 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.565086 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.579204 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6753f7eb0f3ff7e4c94125513be77026242079d658b7139e86701a2b04ba5d1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://823db4c8596fa85759a3d818b0282b8b8b2c4c2e948160accbd306ef09e38060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"web
hook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.598603 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.628273 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.645648 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4175afbbdf7905f2c69f608e1cd85fdfae2818582d88e9a6d46c20ba0cede788\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.660177 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6753f7eb0f3ff7e4c94125513be77026242079d658b7139e86701a2b04ba5d1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://823db4c8596fa85759a3d818b0282b8b8b2c4c2e948160accbd306ef09e38060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.677843 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.699662 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fad7ffe6-f721-4c38-8acd-2d4de59b527b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da0287ba03b221e49a8840f93c53df8a865b9216a4e93f4a0fc4c62d79227004\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edaa07dca4d8aba0a6439a43af7397b5d9ef4d5a398f5c03ac93f3f41fb80ed9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef77e4c71ea4d3dd7aa61afb0787bf88352730ae638cfefe1f9f9f033208692a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af8925d64d9d95117625cfca58e2f0c90128ce63c82370e3a505d48fce14e474\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af8925d64d9d95117625cfca58e2f0c90128ce63c82370e3a505d48fce14e474\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03
T00:07:06Z\\\",\\\"message\\\":\\\"file observer\\\\nW1003 00:07:06.003008 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 00:07:06.003232 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 00:07:06.007835 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-713505854/tls.crt::/tmp/serving-cert-713505854/tls.key\\\\\\\"\\\\nI1003 00:07:06.247728 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 00:07:06.253351 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 00:07:06.253377 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 00:07:06.253665 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 00:07:06.253677 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 00:07:06.261087 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 00:07:06.261596 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 00:07:06.261602 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 00:07:06.261606 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 00:07:06.261610 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 00:07:06.261613 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 00:07:06.261616 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 00:07:06.261905 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 00:07:06.264368 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc6fca5c4906594dc721c77dba62a306d840e88ed949a9a579a695bdc1f1c011\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://359b3f91c772c18a52ed1e5985275405b990187a860cf7a7f62c2f602f54c77e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://359b3f91c772c18a52ed1e5985275405b990187a860cf7a7f62c2f602f54c77e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:06:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.716792 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.723647 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.723856 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 00:07:06 crc kubenswrapper[4832]: E1003 00:07:06.724123 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 00:07:07.724037993 +0000 UTC m=+20.986193099 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 00:07:06 crc kubenswrapper[4832]: E1003 00:07:06.724125 4832 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.727820 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 00:07:06 crc kubenswrapper[4832]: E1003 00:07:06.727922 4832 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 00:07:06 crc kubenswrapper[4832]: E1003 00:07:06.728081 4832 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 00:07:06 crc kubenswrapper[4832]: E1003 00:07:06.728107 4832 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 00:07:06 crc kubenswrapper[4832]: E1003 00:07:06.728147 4832 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 00:07:06 crc kubenswrapper[4832]: E1003 00:07:06.728173 4832 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.728040 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 00:07:06 crc kubenswrapper[4832]: E1003 00:07:06.728351 4832 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 03 00:07:06 crc kubenswrapper[4832]: E1003 00:07:06.728393 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. 
No retries permitted until 2025-10-03 00:07:07.728201688 +0000 UTC m=+20.990356984 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 00:07:06 crc kubenswrapper[4832]: E1003 00:07:06.728544 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-03 00:07:07.728528349 +0000 UTC m=+20.990683455 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.728635 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 00:07:06 crc kubenswrapper[4832]: E1003 00:07:06.728741 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 00:07:07.728679644 +0000 UTC m=+20.990834750 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 03 00:07:06 crc kubenswrapper[4832]: E1003 00:07:06.728811 4832 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 00:07:06 crc kubenswrapper[4832]: E1003 00:07:06.728856 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 00:07:07.728846419 +0000 UTC m=+20.991001735 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 00:07:06 crc kubenswrapper[4832]: I1003 00:07:06.733738 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 00:07:07 crc kubenswrapper[4832]: I1003 00:07:07.386042 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 00:07:07 crc kubenswrapper[4832]: I1003 00:07:07.386083 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 00:07:07 crc kubenswrapper[4832]: E1003 00:07:07.386234 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 00:07:07 crc kubenswrapper[4832]: I1003 00:07:07.386466 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 00:07:07 crc kubenswrapper[4832]: E1003 00:07:07.386516 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 00:07:07 crc kubenswrapper[4832]: E1003 00:07:07.386592 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 00:07:07 crc kubenswrapper[4832]: I1003 00:07:07.390604 4832 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Oct 03 00:07:07 crc kubenswrapper[4832]: I1003 00:07:07.391617 4832 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Oct 03 00:07:07 crc kubenswrapper[4832]: I1003 00:07:07.392825 4832 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Oct 03 00:07:07 crc kubenswrapper[4832]: I1003 00:07:07.393995 4832 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Oct 03 00:07:07 crc kubenswrapper[4832]: I1003 00:07:07.395024 4832 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Oct 03 00:07:07 crc kubenswrapper[4832]: I1003 00:07:07.395898 4832 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Oct 03 00:07:07 crc kubenswrapper[4832]: I1003 00:07:07.396967 4832 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Oct 03 00:07:07 crc kubenswrapper[4832]: I1003 00:07:07.397918 4832 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Oct 03 00:07:07 crc kubenswrapper[4832]: I1003 00:07:07.401207 4832 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" 
path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Oct 03 00:07:07 crc kubenswrapper[4832]: I1003 00:07:07.402117 4832 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Oct 03 00:07:07 crc kubenswrapper[4832]: I1003 00:07:07.402935 4832 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Oct 03 00:07:07 crc kubenswrapper[4832]: I1003 00:07:07.403247 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fad7ffe6-f721-4c38-8acd-2d4de59b527b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da0287ba03b221e49a8840f93c53df8a865b9216a4e93f4a0fc4c62d79227004\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edaa07dca4d8aba0a6439a43af7397b5d9ef4d5a398f5c03ac93f3f41fb80ed9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\
\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef77e4c71ea4d3dd7aa61afb0787bf88352730ae638cfefe1f9f9f033208692a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af8925d64d9d95117625cfca58e2f0c90128ce63c82370e3a505d48fce14e474\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af8925d64d9d95117625cfca58e2f0c90128ce63c82370e3a505d48fce14e474\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"message\\\":\\\"file observer\\\\nW1003 00:07:06.003008 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 00:07:06.003232 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 00:07:06.007835 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-713505854/tls.crt::/tmp/serving-cert-713505854/tls.key\\\\\\\"\\\\nI1003 00:07:06.247728 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 00:07:06.253351 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 00:07:06.253377 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 00:07:06.253665 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 00:07:06.253677 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 00:07:06.261087 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 00:07:06.261596 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 00:07:06.261602 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 00:07:06.261606 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 00:07:06.261610 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 00:07:06.261613 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 00:07:06.261616 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 
00:07:06.261905 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 00:07:06.264368 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc6fca5c4906594dc721c77dba62a306d840e88ed949a9a579a695bdc1f1c011\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://359b3f91c772c18a52ed1e5985275405b990187a860cf7a7f62c2f602f54c77e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://359b3f91c772c18a52ed1e5985275405b990187a860cf7a7f62c2f602f54c77e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:06:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:07Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:07 crc kubenswrapper[4832]: I1003 00:07:07.404602 4832 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Oct 03 00:07:07 crc kubenswrapper[4832]: I1003 00:07:07.405311 4832 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Oct 03 00:07:07 crc kubenswrapper[4832]: I1003 00:07:07.406625 4832 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Oct 03 00:07:07 crc kubenswrapper[4832]: I1003 00:07:07.407540 4832 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Oct 03 00:07:07 crc kubenswrapper[4832]: I1003 00:07:07.408807 4832 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Oct 03 00:07:07 crc kubenswrapper[4832]: I1003 00:07:07.409587 4832 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Oct 03 00:07:07 crc kubenswrapper[4832]: I1003 00:07:07.410235 4832 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Oct 03 00:07:07 crc kubenswrapper[4832]: I1003 00:07:07.411942 4832 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Oct 03 00:07:07 crc kubenswrapper[4832]: I1003 00:07:07.412806 4832 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Oct 03 00:07:07 crc kubenswrapper[4832]: I1003 00:07:07.413419 4832 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Oct 03 00:07:07 crc kubenswrapper[4832]: I1003 00:07:07.414852 4832 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Oct 03 00:07:07 crc kubenswrapper[4832]: I1003 00:07:07.415472 4832 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Oct 03 00:07:07 crc kubenswrapper[4832]: I1003 00:07:07.416905 4832 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Oct 03 00:07:07 crc kubenswrapper[4832]: I1003 00:07:07.417520 4832 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Oct 03 00:07:07 crc kubenswrapper[4832]: I1003 00:07:07.419189 4832 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Oct 03 00:07:07 crc kubenswrapper[4832]: I1003 00:07:07.419339 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:07Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:07 crc kubenswrapper[4832]: I1003 00:07:07.420174 4832 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Oct 03 00:07:07 crc kubenswrapper[4832]: I1003 00:07:07.421381 4832 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Oct 03 00:07:07 crc kubenswrapper[4832]: I1003 00:07:07.422357 4832 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Oct 03 00:07:07 crc kubenswrapper[4832]: I1003 00:07:07.423096 4832 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Oct 03 00:07:07 crc kubenswrapper[4832]: I1003 00:07:07.424448 4832 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Oct 03 00:07:07 crc kubenswrapper[4832]: I1003 00:07:07.424587 
4832 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Oct 03 00:07:07 crc kubenswrapper[4832]: I1003 00:07:07.427041 4832 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Oct 03 00:07:07 crc kubenswrapper[4832]: I1003 00:07:07.428671 4832 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Oct 03 00:07:07 crc kubenswrapper[4832]: I1003 00:07:07.429419 4832 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Oct 03 00:07:07 crc kubenswrapper[4832]: I1003 00:07:07.431651 4832 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Oct 03 00:07:07 crc kubenswrapper[4832]: I1003 00:07:07.434242 4832 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Oct 03 00:07:07 crc kubenswrapper[4832]: I1003 00:07:07.435152 4832 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Oct 03 00:07:07 crc kubenswrapper[4832]: I1003 00:07:07.436822 4832 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Oct 03 00:07:07 crc kubenswrapper[4832]: I1003 00:07:07.437117 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:07Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:07 crc kubenswrapper[4832]: I1003 00:07:07.438045 4832 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Oct 03 00:07:07 crc kubenswrapper[4832]: I1003 00:07:07.439434 4832 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Oct 03 00:07:07 crc kubenswrapper[4832]: I1003 00:07:07.440500 4832 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Oct 03 00:07:07 crc kubenswrapper[4832]: I1003 00:07:07.442112 4832 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Oct 03 00:07:07 crc kubenswrapper[4832]: I1003 00:07:07.443680 4832 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Oct 03 00:07:07 crc kubenswrapper[4832]: I1003 00:07:07.444582 4832 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Oct 03 00:07:07 crc kubenswrapper[4832]: I1003 00:07:07.445948 4832 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Oct 03 00:07:07 crc kubenswrapper[4832]: I1003 00:07:07.446939 4832 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Oct 03 00:07:07 crc kubenswrapper[4832]: I1003 00:07:07.448638 4832 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Oct 03 00:07:07 crc kubenswrapper[4832]: I1003 00:07:07.449531 4832 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" 
path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Oct 03 00:07:07 crc kubenswrapper[4832]: I1003 00:07:07.450404 4832 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Oct 03 00:07:07 crc kubenswrapper[4832]: I1003 00:07:07.451872 4832 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Oct 03 00:07:07 crc kubenswrapper[4832]: I1003 00:07:07.452571 4832 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Oct 03 00:07:07 crc kubenswrapper[4832]: I1003 00:07:07.453390 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:07Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:07 crc kubenswrapper[4832]: I1003 00:07:07.453854 4832 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Oct 03 00:07:07 crc kubenswrapper[4832]: I1003 00:07:07.454365 4832 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Oct 03 00:07:07 crc kubenswrapper[4832]: I1003 00:07:07.473303 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4175afbbdf7905f2c69f608e1cd85fdfae2818582d88e9a6d46c20ba0cede788\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to 
call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:07Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:07 crc kubenswrapper[4832]: I1003 00:07:07.494735 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6753f7eb0f3ff7e4c94125513be77026242079d658b7139e86701a2b04ba5d1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://823db4c8596fa85759a3d818b0282b8b8b2c4c2e948160accbd306ef09e38060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:07Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:07 crc kubenswrapper[4832]: I1003 
00:07:07.510821 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:07Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:07 crc kubenswrapper[4832]: I1003 00:07:07.519586 4832 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Oct 03 00:07:07 crc kubenswrapper[4832]: I1003 00:07:07.521856 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"0636ac12ebbda7e4ee8bbe4f5be5454ca7a879c92c8f444466a34c387668398c"} Oct 03 00:07:07 crc kubenswrapper[4832]: I1003 00:07:07.522377 4832 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 00:07:07 crc kubenswrapper[4832]: I1003 00:07:07.536012 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:07Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:07 crc kubenswrapper[4832]: I1003 00:07:07.556087 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:07Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:07 crc kubenswrapper[4832]: I1003 00:07:07.574212 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fad7ffe6-f721-4c38-8acd-2d4de59b527b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da0287ba03b221e49a8840f93c53df8a865b9216a4e93f4a0fc4c62d79227004\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edaa07dca4d8aba0a6439a43af7397b5d9ef4d5a398f5c03ac93f3f41fb80ed9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef77e4c71ea4d3dd7aa61afb0787bf88352730ae638cfefe1f9f9f033208692a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0636ac12ebbda7e4ee8bbe4f5be5454ca7a879c92c8f444466a34c387668398c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af8925d64d9d95117625cfca58e2f0c90128ce63c82370e3a505d48fce14e474\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"message\\\":\\\"file observer\\\\nW1003 00:07:06.003008 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 00:07:06.003232 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 00:07:06.007835 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-713505854/tls.crt::/tmp/serving-cert-713505854/tls.key\\\\\\\"\\\\nI1003 00:07:06.247728 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 00:07:06.253351 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 00:07:06.253377 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 00:07:06.253665 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 00:07:06.253677 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 00:07:06.261087 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 00:07:06.261596 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 00:07:06.261602 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 00:07:06.261606 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 00:07:06.261610 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 00:07:06.261613 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 00:07:06.261616 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 00:07:06.261905 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 00:07:06.264368 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc6fca5c4906594dc721c77dba62a306d840e88ed949a9a579a695bdc1f1c011\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://359b3f91c772c18a52ed1e5985275405b990187a860cf7a7f62c2f602f54c77e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://359b3f91c772c18a52ed1e5985275405b990187a860cf7a7f62c2f602f54c77e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:06:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:07Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:07 crc kubenswrapper[4832]: I1003 00:07:07.606774 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4175afbbdf7905f2c69f608e1cd85fdfae2818582d88e9a6d46c20ba0cede788\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:07Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:07 crc kubenswrapper[4832]: I1003 00:07:07.638717 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6753f7eb0f3ff7e4c94125513be77026242079d658b7139e86701a2b04ba5d1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://823db4c8596fa85759a3d818b0282b8b8b2c4c2e948160accbd306ef09e38060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:07Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:07 crc kubenswrapper[4832]: I1003 00:07:07.655576 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:07Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:07 crc kubenswrapper[4832]: I1003 00:07:07.673982 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:07Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:07 crc kubenswrapper[4832]: I1003 00:07:07.738538 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 00:07:07 crc kubenswrapper[4832]: I1003 00:07:07.738606 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 00:07:07 crc kubenswrapper[4832]: I1003 00:07:07.738732 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 00:07:07 crc kubenswrapper[4832]: E1003 00:07:07.738704 4832 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 03 00:07:07 crc kubenswrapper[4832]: E1003 00:07:07.738823 4832 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 00:07:07 crc kubenswrapper[4832]: E1003 00:07:07.738783 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-03 00:07:09.738733606 +0000 UTC m=+23.000888712 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 00:07:07 crc kubenswrapper[4832]: I1003 00:07:07.738877 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 00:07:07 crc kubenswrapper[4832]: I1003 00:07:07.738898 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 00:07:07 crc kubenswrapper[4832]: E1003 00:07:07.738961 4832 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 00:07:07 crc kubenswrapper[4832]: E1003 00:07:07.738971 4832 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 00:07:07 crc kubenswrapper[4832]: E1003 00:07:07.738980 4832 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 00:07:07 crc kubenswrapper[4832]: E1003 00:07:07.739008 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-03 00:07:09.738995604 +0000 UTC m=+23.001150710 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 00:07:07 crc kubenswrapper[4832]: E1003 00:07:07.739046 4832 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 00:07:07 crc kubenswrapper[4832]: E1003 00:07:07.739056 4832 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 00:07:07 crc kubenswrapper[4832]: E1003 00:07:07.739064 4832 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 00:07:07 crc kubenswrapper[4832]: E1003 00:07:07.739084 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-03 00:07:09.739078887 +0000 UTC m=+23.001233993 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 00:07:07 crc kubenswrapper[4832]: E1003 00:07:07.739099 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 00:07:09.739092158 +0000 UTC m=+23.001247254 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 03 00:07:07 crc kubenswrapper[4832]: E1003 00:07:07.739109 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 00:07:09.739104278 +0000 UTC m=+23.001259384 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 00:07:09 crc kubenswrapper[4832]: I1003 00:07:09.385792 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 00:07:09 crc kubenswrapper[4832]: I1003 00:07:09.385888 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 00:07:09 crc kubenswrapper[4832]: E1003 00:07:09.385937 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 00:07:09 crc kubenswrapper[4832]: I1003 00:07:09.385808 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 00:07:09 crc kubenswrapper[4832]: E1003 00:07:09.386022 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 00:07:09 crc kubenswrapper[4832]: E1003 00:07:09.386106 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 00:07:09 crc kubenswrapper[4832]: I1003 00:07:09.528166 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"78a6b04d90ea1fd4356f0a54594a83fd6029be216c6263654ad4916f2c3b7f5c"} Oct 03 00:07:09 crc kubenswrapper[4832]: I1003 00:07:09.544177 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:09Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:09 crc kubenswrapper[4832]: I1003 00:07:09.558941 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4175afbbdf7905f2c69f608e1cd85fdfae2818582d88e9a6d46c20ba0cede788\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:09Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:09 crc kubenswrapper[4832]: I1003 00:07:09.572451 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6753f7eb0f3ff7e4c94125513be77026242079d658b7139e86701a2b04ba5d1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://823db4c8596fa85759a3d818b0282b8b8b2c4c2e948160accbd306ef09e38060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:09Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:09 crc kubenswrapper[4832]: I1003 00:07:09.583251 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78a6b04d90ea1fd4356f0a54594a83fd6029be216c6263654ad4916f2c3b7f5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:09Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:09 crc kubenswrapper[4832]: I1003 00:07:09.598041 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fad7ffe6-f721-4c38-8acd-2d4de59b527b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da0287ba03b221e49a8840f93c53df8a865b9216a4e93f4a0fc4c62d79227004\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edaa07dca4d8aba0a6439a43af7397b5d9ef4d5a398f5c03ac93f3f41fb80ed9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef77e4c71ea4d3dd7aa61afb0787bf88352730ae638cfefe1f9f9f033208692a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0636ac12ebbda7e4ee8bbe4f5be5454ca7a879c92c8f444466a34c387668398c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af8925d64d9d95117625cfca58e2f0c90128ce63c82370e3a505d48fce14e474\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"message\\\":\\\"file observer\\\\nW1003 00:07:06.003008 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 00:07:06.003232 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 00:07:06.007835 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-713505854/tls.crt::/tmp/serving-cert-713505854/tls.key\\\\\\\"\\\\nI1003 00:07:06.247728 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 00:07:06.253351 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 00:07:06.253377 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 00:07:06.253665 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 00:07:06.253677 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 00:07:06.261087 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 00:07:06.261596 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 00:07:06.261602 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 00:07:06.261606 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 00:07:06.261610 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 00:07:06.261613 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 00:07:06.261616 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 00:07:06.261905 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 00:07:06.264368 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc6fca5c4906594dc721c77dba62a306d840e88ed949a9a579a695bdc1f1c011\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://359b3f91c772c18a52ed1e5985275405b990187a860cf7a7f62c2f602f54c77e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://359b3f91c772c18a52ed1e5985275405b990187a860cf7a7f62c2f602f54c77e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:06:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:09Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:09 crc kubenswrapper[4832]: I1003 00:07:09.609098 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:09Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:09 crc kubenswrapper[4832]: I1003 00:07:09.620195 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:09Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:09 crc kubenswrapper[4832]: I1003 00:07:09.755433 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 00:07:09 crc kubenswrapper[4832]: I1003 00:07:09.755495 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 00:07:09 crc kubenswrapper[4832]: I1003 00:07:09.755516 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 00:07:09 crc kubenswrapper[4832]: I1003 00:07:09.755539 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 00:07:09 crc kubenswrapper[4832]: I1003 00:07:09.755559 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 00:07:09 crc kubenswrapper[4832]: E1003 00:07:09.755633 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-03 00:07:13.755603964 +0000 UTC m=+27.017759080 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 00:07:09 crc kubenswrapper[4832]: E1003 00:07:09.755655 4832 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 03 00:07:09 crc kubenswrapper[4832]: E1003 00:07:09.755726 4832 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 00:07:09 crc kubenswrapper[4832]: E1003 00:07:09.756067 4832 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 00:07:09 crc kubenswrapper[4832]: E1003 00:07:09.756081 4832 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 00:07:09 crc kubenswrapper[4832]: E1003 00:07:09.755668 4832 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 00:07:09 crc kubenswrapper[4832]: E1003 00:07:09.756129 4832 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 00:07:09 crc kubenswrapper[4832]: E1003 00:07:09.756135 4832 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 00:07:09 crc kubenswrapper[4832]: E1003 00:07:09.755667 4832 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 00:07:09 crc kubenswrapper[4832]: E1003 00:07:09.755747 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 00:07:13.755725928 +0000 UTC m=+27.017881074 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 03 00:07:09 crc kubenswrapper[4832]: E1003 00:07:09.756185 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-03 00:07:13.756172763 +0000 UTC m=+27.018327869 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 00:07:09 crc kubenswrapper[4832]: E1003 00:07:09.756196 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 00:07:13.756191413 +0000 UTC m=+27.018346519 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 00:07:09 crc kubenswrapper[4832]: E1003 00:07:09.756211 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-03 00:07:13.756206594 +0000 UTC m=+27.018361700 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 00:07:10 crc kubenswrapper[4832]: I1003 00:07:10.332912 4832 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 03 00:07:10 crc kubenswrapper[4832]: I1003 00:07:10.337732 4832 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 03 00:07:10 crc kubenswrapper[4832]: I1003 00:07:10.343451 4832 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Oct 03 00:07:10 crc kubenswrapper[4832]: I1003 00:07:10.351806 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:10Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:10 crc kubenswrapper[4832]: I1003 00:07:10.369932 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4175afbbdf7905f2c69f608e1cd85fdfae2818582d88e9a6d46c20ba0cede788\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:10Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:10 crc kubenswrapper[4832]: I1003 00:07:10.384729 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6753f7eb0f3ff7e4c94125513be77026242079d658b7139e86701a2b04ba5d1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://823db4c8596fa85759a3d818b0282b8b8b2c4c2e948160accbd306ef09e38060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:10Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:10 crc kubenswrapper[4832]: I1003 00:07:10.388048 4832 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Oct 03 00:07:10 crc kubenswrapper[4832]: I1003 00:07:10.399627 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78a6b04d90ea1fd4356f0a54594a83fd6029be216c6263654ad4916f2c3b7f5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:10Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:10 crc kubenswrapper[4832]: I1003 00:07:10.399675 4832 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Oct 03 00:07:10 crc kubenswrapper[4832]: I1003 00:07:10.416025 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fad7ffe6-f721-4c38-8acd-2d4de59b527b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da0287ba03b221e49a8840f93c53df8a865b9216a4e93f4a0fc4c62d79227004\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edaa07dca4d8aba0a6439a43af7397b5d9ef4d5a398f5c03ac93f3f41fb80ed9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef77e4c71ea4d3dd7aa61afb0787bf88352730ae638cfefe1f9f9f033208692a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0636ac12ebbda7e4ee8bbe4f5be5454ca7a879c92c8f444466a34c387668398c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af8925d64d9d95117625cfca58e2f0c90128ce63c82370e3a505d48fce14e474\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"message\\\":\\\"file observer\\\\nW1003 00:07:06.003008 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 00:07:06.003232 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 00:07:06.007835 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-713505854/tls.crt::/tmp/serving-cert-713505854/tls.key\\\\\\\"\\\\nI1003 00:07:06.247728 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 00:07:06.253351 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 00:07:06.253377 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 00:07:06.253665 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 00:07:06.253677 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 00:07:06.261087 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 00:07:06.261596 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 00:07:06.261602 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 00:07:06.261606 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 00:07:06.261610 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 00:07:06.261613 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 00:07:06.261616 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 00:07:06.261905 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 00:07:06.264368 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc6fca5c4906594dc721c77dba62a306d840e88ed949a9a579a695bdc1f1c011\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://359b3f91c772c18a52ed1e5985275405b990187a860cf7a7f62c2f602f54c77e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://359b3f91c772c18a52ed1e5985275405b990187a860cf7a7f62c2f602f54c77e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:06:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:10Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:10 crc kubenswrapper[4832]: I1003 00:07:10.430882 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:10Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:10 crc kubenswrapper[4832]: I1003 00:07:10.445067 4832 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Oct 03 00:07:10 crc kubenswrapper[4832]: I1003 00:07:10.448095 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:10Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:10 crc kubenswrapper[4832]: I1003 00:07:10.463394 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fad7ffe6-f721-4c38-8acd-2d4de59b527b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da0287ba03b221e49a8840f93c53df8a865b9216a4e93f4a0fc4c62d79227004\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edaa07dca4d8aba0a6439a43af7397b5d9ef4d5a398f5c03ac93f3f41fb80ed9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef77e4c71ea4d3dd7aa61afb0787bf88352730ae638cfefe1f9f9f033208692a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0636ac12ebbda7e4ee8bbe4f5be5454ca7a879c92c8f444466a34c387668398c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af8925d64d9d95117625cfca58e2f0c90128ce63c82370e3a505d48fce14e474\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"message\\\":\\\"file observer\\\\nW1003 00:07:06.003008 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 00:07:06.003232 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 00:07:06.007835 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-713505854/tls.crt::/tmp/serving-cert-713505854/tls.key\\\\\\\"\\\\nI1003 00:07:06.247728 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 00:07:06.253351 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 00:07:06.253377 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 00:07:06.253665 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 00:07:06.253677 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 00:07:06.261087 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 00:07:06.261596 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 00:07:06.261602 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 00:07:06.261606 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 00:07:06.261610 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 00:07:06.261613 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 00:07:06.261616 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 00:07:06.261905 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 00:07:06.264368 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc6fca5c4906594dc721c77dba62a306d840e88ed949a9a579a695bdc1f1c011\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://359b3f91c772c18a52ed1e5985275405b990187a860cf7a7f62c2f602f54c77e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://359b3f91c772c18a52ed1e5985275405b990187a860cf7a7f62c2f602f54c77e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:06:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:10Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:10 crc kubenswrapper[4832]: I1003 00:07:10.477247 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a530701-412d-4b02-b532-de7993881992\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2b0553c8f774a096bc1446325436a1a73a24f67b3512462ce0b5029092f30c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0451961c80d1affd1cc6903d8c4ccba7de47c25255826d49ee312958f7a73b79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09e57b2ff533ebe739ef6e23dcd8d6f5ef85143046db77ed3a95ffb87f771152\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://703d4dc13279deda56e8154ab7238b507e78852fd477db2bbe6e3e19aa0fbd37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:06:47Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:10Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:10 crc kubenswrapper[4832]: I1003 00:07:10.491939 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:10Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:10 crc kubenswrapper[4832]: I1003 00:07:10.505241 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:10Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:10 crc kubenswrapper[4832]: I1003 00:07:10.549222 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31c3e0e9-c2a7-4c60-b523-9972df4adad5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1688f7425bdf5630147752e6d2c0dc58351b86d087c39bc6c4d73ffbf11949b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70f4161badca9a8087ab95b77ede5bc72d0c44b3f7f0223f2073354e76d6070d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"r
estartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5eb1fa15d84200378239e8a064cb83aa401750e92073f9d29d94858e65add776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://342b151ec083dc566428af857e4013b70d34fbf70632147d13f5e251ef3460c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f39a6c9881b2f1d620a85a3c6887d82b81d18ecf599c66b967d5580689537314\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a180babeb6469b2158474f74047864c74117a5c272be9437712bc1729563108b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\
\\":{\\\"containerID\\\":\\\"cri-o://a180babeb6469b2158474f74047864c74117a5c272be9437712bc1729563108b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b12fcbe54af7073178320f8a130e78196b2d1d3a055c4d8410e4cc1da2d4686e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b12fcbe54af7073178320f8a130e78196b2d1d3a055c4d8410e4cc1da2d4686e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f5e7853d0c8e96873c687c75d7226d8c11ed0ab5f4f2e1e975423de06b7c8c10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5e7853d0c8e96873c687c75d7226d8c11ed0ab5f4f2e1e975423de06b7c8c10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:06:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:10Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:10 crc kubenswrapper[4832]: I1003 00:07:10.565318 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:10Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:10 crc kubenswrapper[4832]: I1003 00:07:10.586605 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4175afbbdf7905f2c69f608e1cd85fdfae2818582d88e9a6d46c20ba0cede788\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:10Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:10 crc kubenswrapper[4832]: I1003 00:07:10.603297 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6753f7eb0f3ff7e4c94125513be77026242079d658b7139e86701a2b04ba5d1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://823db4c8596fa85759a3d818b0282b8b8b2c4c2e948160accbd306ef09e38060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:10Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:10 crc kubenswrapper[4832]: I1003 00:07:10.628559 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78a6b04d90ea1fd4356f0a54594a83fd6029be216c6263654ad4916f2c3b7f5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:10Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.109768 4832 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-xzpk4"] Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.110104 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.112328 4832 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-b9c9g"] Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.112638 4832 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-b9c9g" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.121226 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.122963 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.124149 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.125649 4832 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.125898 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.126381 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.126947 4832 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.133617 4832 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.171869 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31c3e0e9-c2a7-4c60-b523-9972df4adad5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1688f7425bdf5630147752e6d2c0dc58351b86d087c39bc6c4d73ffbf11949b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70f4161badca9a8087ab95b77ede5bc72d0c44b3f7f0223f2073354e76d6070d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5eb1fa15d84200378239e8a064cb83aa401750e92073f9d29d94858e65add776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://342b151ec083dc566428af857e4013b70d34fbf70632147d13f5e251ef3460c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f39a6c9881b2f1d620a85a3c6887d82b81d18ecf599c66b967d5580689537314\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a180babeb6469b215847
4f74047864c74117a5c272be9437712bc1729563108b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a180babeb6469b2158474f74047864c74117a5c272be9437712bc1729563108b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b12fcbe54af7073178320f8a130e78196b2d1d3a055c4d8410e4cc1da2d4686e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b12fcbe54af7073178320f8a130e78196b2d1d3a055c4d8410e4cc1da2d4686e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f5e7853d0c8e96873c687c75d7226d8c11ed0ab5f4f2e1e975423de06b7c8c10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5e7853d0c8e96873c687c75d7226d8c11ed0ab5f4f2e1e975423de06b7c8c10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:06:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:11Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.184265 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:11Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.198393 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4175afbbdf7905f2c69f608e1cd85fdfae2818582d88e9a6d46c20ba0cede788\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:11Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.210870 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6753f7eb0f3ff7e4c94125513be77026242079d658b7139e86701a2b04ba5d1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://823db4c8596fa85759a3d818b0282b8b8b2c4c2e948160accbd306ef09e38060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:11Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.223018 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78a6b04d90ea1fd4356f0a54594a83fd6029be216c6263654ad4916f2c3b7f5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:11Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.232227 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13eeb165-19f7-4e3f-a442-1c9c570d853b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfnk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfnk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xzpk4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:11Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.245243 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fad7ffe6-f721-4c38-8acd-2d4de59b527b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da0287ba03b221e49a8840f93c53df8a865b9216a4e93f4a0fc4c62d79227004\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edaa07dca4d8aba0a6439a43af7397b5d9ef4d5a398f5c03ac93f3f41fb80ed9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef77e4c71ea4d3dd7aa61afb0787bf88352730ae638cfefe1f9f9f033208692a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0636ac12ebbda7e4ee8bbe4f5be5454ca7a879c92c8f444466a34c387668398c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af8925d64d9d95117625cfca58e2f0c90128ce63c82370e3a505d48fce14e474\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"message\\\":\\\"file observer\\\\nW1003 00:07:06.003008 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 00:07:06.003232 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 00:07:06.007835 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-713505854/tls.crt::/tmp/serving-cert-713505854/tls.key\\\\\\\"\\\\nI1003 00:07:06.247728 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 00:07:06.253351 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 00:07:06.253377 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 00:07:06.253665 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 00:07:06.253677 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 00:07:06.261087 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 00:07:06.261596 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 00:07:06.261602 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 00:07:06.261606 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 00:07:06.261610 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 00:07:06.261613 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 00:07:06.261616 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 00:07:06.261905 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 00:07:06.264368 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc6fca5c4906594dc721c77dba62a306d840e88ed949a9a579a695bdc1f1c011\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://359b3f91c772c18a52ed1e5985275405b990187a860cf7a7f62c2f602f54c77e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://359b3f91c772c18a52ed1e5985275405b990187a860cf7a7f62c2f602f54c77e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:06:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:11Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.262118 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a530701-412d-4b02-b532-de7993881992\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2b0553c8f774a096bc1446325436a1a73a24f67b3512462ce0b5029092f30c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0451961c80d1affd1cc6903d8c4ccba7de47c25255826d49ee312958f7a73b79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09e57b2ff533ebe739ef6e23dcd8d6f5ef85143046db77ed3a95ffb87f771152\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://703d4dc13279deda56e8154ab7238b507e78852fd477db2bbe6e3e19aa0fbd37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:06:47Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:11Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.267196 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/13eeb165-19f7-4e3f-a442-1c9c570d853b-proxy-tls\") pod \"machine-config-daemon-xzpk4\" (UID: \"13eeb165-19f7-4e3f-a442-1c9c570d853b\") " pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.267278 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/4369beec-ad3f-4de9-909b-7d0f8c76aad3-hosts-file\") pod \"node-resolver-b9c9g\" (UID: \"4369beec-ad3f-4de9-909b-7d0f8c76aad3\") " pod="openshift-dns/node-resolver-b9c9g" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.267305 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sfnk8\" (UniqueName: \"kubernetes.io/projected/13eeb165-19f7-4e3f-a442-1c9c570d853b-kube-api-access-sfnk8\") pod \"machine-config-daemon-xzpk4\" (UID: \"13eeb165-19f7-4e3f-a442-1c9c570d853b\") " pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.267335 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvpm6\" (UniqueName: \"kubernetes.io/projected/4369beec-ad3f-4de9-909b-7d0f8c76aad3-kube-api-access-qvpm6\") pod \"node-resolver-b9c9g\" (UID: \"4369beec-ad3f-4de9-909b-7d0f8c76aad3\") " pod="openshift-dns/node-resolver-b9c9g" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.267355 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/13eeb165-19f7-4e3f-a442-1c9c570d853b-rootfs\") pod \"machine-config-daemon-xzpk4\" (UID: \"13eeb165-19f7-4e3f-a442-1c9c570d853b\") " pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.267376 4832 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/13eeb165-19f7-4e3f-a442-1c9c570d853b-mcd-auth-proxy-config\") pod \"machine-config-daemon-xzpk4\" (UID: \"13eeb165-19f7-4e3f-a442-1c9c570d853b\") " pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.277222 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:11Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.290032 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:11Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.305057 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a530701-412d-4b02-b532-de7993881992\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2b0553c8f774a096bc1446325436a1a73a24f67b3512462ce0b5029092f30c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0451961c80d1affd1cc6903d8c4ccba7de47c25255826d49ee312958f7a73b79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09e57b2ff533ebe739ef6e23dcd8d6f5ef85143046db77ed3a95ffb87f771152\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://703d4dc13279deda56e8154ab7238b507e78852fd477db2bbe6e3e19aa0fbd37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:06:47Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:11Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.319986 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:11Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.336576 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:11Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.348307 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13eeb165-19f7-4e3f-a442-1c9c570d853b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfnk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfnk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xzpk4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:11Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.361648 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fad7ffe6-f721-4c38-8acd-2d4de59b527b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da0287ba03b221e49a8840f93c53df8a865b9216a4e93f4a0fc4c62d79227004\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edaa07dca4d8aba0a6439a43af7397b5d9ef4d5a398f5c03ac93f3f41fb80ed9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef77e4c71ea4d3dd7aa61afb0787bf88352730ae638cfefe1f9f9f033208692a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0636ac12ebbda7e4ee8bbe4f5be5454ca7a879c92c8f444466a34c387668398c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af8925d64d9d95117625cfca58e2f0c90128ce63c82370e3a505d48fce14e474\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"message\\\":\\\"file observer\\\\nW1003 00:07:06.003008 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 00:07:06.003232 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 00:07:06.007835 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-713505854/tls.crt::/tmp/serving-cert-713505854/tls.key\\\\\\\"\\\\nI1003 00:07:06.247728 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 00:07:06.253351 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 00:07:06.253377 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 00:07:06.253665 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 00:07:06.253677 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 00:07:06.261087 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 00:07:06.261596 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 00:07:06.261602 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 00:07:06.261606 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 00:07:06.261610 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 00:07:06.261613 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 00:07:06.261616 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 00:07:06.261905 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 00:07:06.264368 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc6fca5c4906594dc721c77dba62a306d840e88ed949a9a579a695bdc1f1c011\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://359b3f91c772c18a52ed1e5985275405b990187a860cf7a7f62c2f602f54c77e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://359b3f91c772c18a52ed1e5985275405b990187a860cf7a7f62c2f602f54c77e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:06:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:11Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.368191 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/4369beec-ad3f-4de9-909b-7d0f8c76aad3-hosts-file\") pod \"node-resolver-b9c9g\" (UID: \"4369beec-ad3f-4de9-909b-7d0f8c76aad3\") " pod="openshift-dns/node-resolver-b9c9g" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.368237 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sfnk8\" (UniqueName: \"kubernetes.io/projected/13eeb165-19f7-4e3f-a442-1c9c570d853b-kube-api-access-sfnk8\") pod \"machine-config-daemon-xzpk4\" (UID: \"13eeb165-19f7-4e3f-a442-1c9c570d853b\") " pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.368262 
4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qvpm6\" (UniqueName: \"kubernetes.io/projected/4369beec-ad3f-4de9-909b-7d0f8c76aad3-kube-api-access-qvpm6\") pod \"node-resolver-b9c9g\" (UID: \"4369beec-ad3f-4de9-909b-7d0f8c76aad3\") " pod="openshift-dns/node-resolver-b9c9g" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.368278 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/13eeb165-19f7-4e3f-a442-1c9c570d853b-rootfs\") pod \"machine-config-daemon-xzpk4\" (UID: \"13eeb165-19f7-4e3f-a442-1c9c570d853b\") " pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.368292 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/13eeb165-19f7-4e3f-a442-1c9c570d853b-mcd-auth-proxy-config\") pod \"machine-config-daemon-xzpk4\" (UID: \"13eeb165-19f7-4e3f-a442-1c9c570d853b\") " pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.368309 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/13eeb165-19f7-4e3f-a442-1c9c570d853b-proxy-tls\") pod \"machine-config-daemon-xzpk4\" (UID: \"13eeb165-19f7-4e3f-a442-1c9c570d853b\") " pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.368316 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/4369beec-ad3f-4de9-909b-7d0f8c76aad3-hosts-file\") pod \"node-resolver-b9c9g\" (UID: \"4369beec-ad3f-4de9-909b-7d0f8c76aad3\") " pod="openshift-dns/node-resolver-b9c9g" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.368419 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/13eeb165-19f7-4e3f-a442-1c9c570d853b-rootfs\") pod \"machine-config-daemon-xzpk4\" (UID: \"13eeb165-19f7-4e3f-a442-1c9c570d853b\") " pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.369222 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/13eeb165-19f7-4e3f-a442-1c9c570d853b-mcd-auth-proxy-config\") pod \"machine-config-daemon-xzpk4\" (UID: \"13eeb165-19f7-4e3f-a442-1c9c570d853b\") " pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.373391 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/13eeb165-19f7-4e3f-a442-1c9c570d853b-proxy-tls\") pod \"machine-config-daemon-xzpk4\" (UID: \"13eeb165-19f7-4e3f-a442-1c9c570d853b\") " pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.379065 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4175afbbdf7905f2c69f608e1cd85fdfae2818582d88e9a6d46c20ba0cede788\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:11Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.385484 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.385559 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 00:07:11 crc kubenswrapper[4832]: E1003 00:07:11.385638 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.385665 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 00:07:11 crc kubenswrapper[4832]: E1003 00:07:11.385734 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 00:07:11 crc kubenswrapper[4832]: E1003 00:07:11.385796 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.388172 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qvpm6\" (UniqueName: \"kubernetes.io/projected/4369beec-ad3f-4de9-909b-7d0f8c76aad3-kube-api-access-qvpm6\") pod \"node-resolver-b9c9g\" (UID: \"4369beec-ad3f-4de9-909b-7d0f8c76aad3\") " pod="openshift-dns/node-resolver-b9c9g" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.391791 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sfnk8\" (UniqueName: \"kubernetes.io/projected/13eeb165-19f7-4e3f-a442-1c9c570d853b-kube-api-access-sfnk8\") pod \"machine-config-daemon-xzpk4\" (UID: \"13eeb165-19f7-4e3f-a442-1c9c570d853b\") " pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.403423 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6753f7eb0f3ff7e4c94125513be77026242079d658b7139e86701a2b04ba5d1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://823db4c8596fa85759a3d818b0282b8b8b2c4c2e948160accbd306ef09e38060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/o
penshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:11Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.415745 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78a6b04d90ea1fd4356f0a54594a83fd6029be216c6263654ad4916f2c3b7f5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:11Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.425851 4832 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.426219 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-b9c9g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4369beec-ad3f-4de9-909b-7d0f8c76aad3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvpm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-b9c9g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:11Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.436378 4832 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-b9c9g" Oct 03 00:07:11 crc kubenswrapper[4832]: W1003 00:07:11.437173 4832 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod13eeb165_19f7_4e3f_a442_1c9c570d853b.slice/crio-b037994d9ef9ff0de895f15a9de53702a119cdd3830ef92938cd583e552b077d WatchSource:0}: Error finding container b037994d9ef9ff0de895f15a9de53702a119cdd3830ef92938cd583e552b077d: Status 404 returned error can't find the container with id b037994d9ef9ff0de895f15a9de53702a119cdd3830ef92938cd583e552b077d Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.449365 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31c3e0e9-c2a7-4c60-b523-9972df4adad5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1688f7425bdf5630147752e6d2c0dc58351b86d087c39bc6c4d73ffbf11949b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70f4161badca9a8087ab95b77ede5bc72d0c44b3f7f0223f2073354e76d6070d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5eb1fa15d84200378239e8a
064cb83aa401750e92073f9d29d94858e65add776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://342b151ec083dc566428af857e4013b70d34fbf70632147d13f5e251ef3460c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f39a6c9881b2f1d620a85a3c6887d82b81d18ecf599c66b967d5580689537314\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a180babeb6469b2158474f74047864c74117a5c272be9437712bc1729563108b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a180babeb6469b2158474f74047864c74117a5c272be9437712bc1729563108b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b12fcbe54af7073178320f8a130e78196b2d1d3a055c4d8410e4cc1da2d4686e\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b12fcbe54af7073178320f8a130e78196b2d1d3a055c4d8410e4cc1da2d4686e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f5e7853d0c8e96873c687c75d7226d8c11ed0ab5f4f2e1e975423de06b7c8c10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5e7853d0c8e96873c687c75d7226d8c11ed0ab5f4f2e1e975423de06b7c8c10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:06:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:11Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.467577 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:11Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.525031 4832 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-rwqt7"] Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.525317 4832 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-zdfsz"] Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.525812 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-zdfsz" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.526070 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-rwqt7" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.526221 4832 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-zc9xl"] Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.528506 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.528515 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.528805 4832 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.530360 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.530838 4832 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.530874 4832 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.530895 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.531122 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.533172 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.533173 4832 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.533876 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" event={"ID":"13eeb165-19f7-4e3f-a442-1c9c570d853b","Type":"ContainerStarted","Data":"b037994d9ef9ff0de895f15a9de53702a119cdd3830ef92938cd583e552b077d"} Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.535607 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.535808 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.535873 4832 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.536031 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.536080 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.536228 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-b9c9g" event={"ID":"4369beec-ad3f-4de9-909b-7d0f8c76aad3","Type":"ContainerStarted","Data":"224b6cf8f7c8c5814dc188d04312d6bf7f358f729113b7b8d8db0509c048d002"} Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.542469 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78a6b04d90ea1fd4356f0a54594a83fd6029be216c6263654ad4916f2c3b7f5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:11Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.550955 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-b9c9g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4369beec-ad3f-4de9-909b-7d0f8c76aad3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvpm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-b9c9g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:11Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.564459 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fad7ffe6-f721-4c38-8acd-2d4de59b527b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da0287ba03b221e49a8840f93c53df8a865b9216a4e93f4a0fc4c62d79227004\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edaa07dca4d8aba0a6439a43af7397b5d9ef4d5a398f5c03ac93f3f41fb80ed9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef77e4c71ea4d3dd7aa61afb0787bf88352730ae638cfefe1f9f9f033208692a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0636ac12ebbda7e4ee8bbe4f5be5454ca7a879c92c8f444466a34c387668398c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af8925d64d9d95117625cfca58e2f0c90128ce63c82370e3a505d48fce14e474\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"message\\\":\\\"file observer\\\\nW1003 00:07:06.003008 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 00:07:06.003232 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 00:07:06.007835 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-713505854/tls.crt::/tmp/serving-cert-713505854/tls.key\\\\\\\"\\\\nI1003 00:07:06.247728 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 00:07:06.253351 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 00:07:06.253377 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 00:07:06.253665 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 00:07:06.253677 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 00:07:06.261087 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 00:07:06.261596 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 00:07:06.261602 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 00:07:06.261606 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 00:07:06.261610 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 00:07:06.261613 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 00:07:06.261616 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 00:07:06.261905 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 00:07:06.264368 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc6fca5c4906594dc721c77dba62a306d840e88ed949a9a579a695bdc1f1c011\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://359b3f91c772c18a52ed1e5985275405b990187a860cf7a7f62c2f602f54c77e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://359b3f91c772c18a52ed1e5985275405b990187a860cf7a7f62c2f602f54c77e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:06:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:11Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.581051 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a530701-412d-4b02-b532-de7993881992\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2b0553c8f774a096bc1446325436a1a73a24f67b3512462ce0b5029092f30c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0451961c80d1affd1cc6903d8c4ccba7de47c25255826d49ee312958f7a73b79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09e57b2ff533ebe739ef6e23dcd8d6f5ef85143046db77ed3a95ffb87f771152\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://703d4dc13279deda56e8154ab7238b507e78852fd477db2bbe6e3e19aa0fbd37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:06:47Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:11Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.604331 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:11Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.623863 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13eeb165-19f7-4e3f-a442-1c9c570d853b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfnk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfnk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xzpk4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:11Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.643991 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"31c3e0e9-c2a7-4c60-b523-9972df4adad5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1688f7425bdf5630147752e6d2c0dc58351b86d087c39bc6c4d73ffbf11949b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70f4161badca9a8087ab95b77ede5bc72d0c44b3f7f0223f2073354e76d6070d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5eb1fa15d84200378239e8a064cb83aa401750e92073f9d29d94858e65add776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://342b151ec083dc566428af857e4013b70d34fbf
70632147d13f5e251ef3460c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f39a6c9881b2f1d620a85a3c6887d82b81d18ecf599c66b967d5580689537314\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a180babeb6469b2158474f74047864c74117a5c272be9437712bc1729563108b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a180babeb6469b2158474f74047864c74117a5c272be9437712bc1729563108b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b12fcbe54af7073178320f8a130e78196b2d1d3a055c4d8410e4cc1da2d4686e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b12fcbe54af7073178320f8a130e78196b2d1d3a055c4d8410e4cc1da2d4686e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f5e7853d0c8e96873c687c75d7226d8c11ed0ab5f4f2e1e975423de06b7c8c10\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5e7853d0c8e96873c687c75d7226d8c11ed0ab5f4f2e1e975423de06b7c8c10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:06:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:11Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.658084 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:11Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.669266 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/96bf1ee1-9249-4db9-83da-17eb15aa5f9a-host-var-lib-cni-multus\") pod \"multus-rwqt7\" (UID: \"96bf1ee1-9249-4db9-83da-17eb15aa5f9a\") " pod="openshift-multus/multus-rwqt7" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.669314 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nm92t\" (UniqueName: \"kubernetes.io/projected/96bf1ee1-9249-4db9-83da-17eb15aa5f9a-kube-api-access-nm92t\") pod \"multus-rwqt7\" (UID: \"96bf1ee1-9249-4db9-83da-17eb15aa5f9a\") " pod="openshift-multus/multus-rwqt7" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.669343 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/96bf1ee1-9249-4db9-83da-17eb15aa5f9a-host-run-k8s-cni-cncf-io\") pod \"multus-rwqt7\" (UID: \"96bf1ee1-9249-4db9-83da-17eb15aa5f9a\") " pod="openshift-multus/multus-rwqt7" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.669367 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/96bf1ee1-9249-4db9-83da-17eb15aa5f9a-multus-daemon-config\") pod \"multus-rwqt7\" (UID: \"96bf1ee1-9249-4db9-83da-17eb15aa5f9a\") " pod="openshift-multus/multus-rwqt7" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.669389 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/96bf1ee1-9249-4db9-83da-17eb15aa5f9a-host-run-multus-certs\") pod \"multus-rwqt7\" (UID: \"96bf1ee1-9249-4db9-83da-17eb15aa5f9a\") " pod="openshift-multus/multus-rwqt7" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.669411 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/66a6f849-6870-4e28-8ca4-c849c22ad252-system-cni-dir\") pod \"multus-additional-cni-plugins-zdfsz\" (UID: \"66a6f849-6870-4e28-8ca4-c849c22ad252\") " pod="openshift-multus/multus-additional-cni-plugins-zdfsz" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.669431 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/96bf1ee1-9249-4db9-83da-17eb15aa5f9a-cnibin\") pod \"multus-rwqt7\" (UID: \"96bf1ee1-9249-4db9-83da-17eb15aa5f9a\") " pod="openshift-multus/multus-rwqt7" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.669455 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/96bf1ee1-9249-4db9-83da-17eb15aa5f9a-hostroot\") pod \"multus-rwqt7\" (UID: \"96bf1ee1-9249-4db9-83da-17eb15aa5f9a\") " pod="openshift-multus/multus-rwqt7" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.669613 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/96bf1ee1-9249-4db9-83da-17eb15aa5f9a-multus-conf-dir\") pod \"multus-rwqt7\" (UID: \"96bf1ee1-9249-4db9-83da-17eb15aa5f9a\") " pod="openshift-multus/multus-rwqt7" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.669666 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/66a6f849-6870-4e28-8ca4-c849c22ad252-cnibin\") pod \"multus-additional-cni-plugins-zdfsz\" (UID: \"66a6f849-6870-4e28-8ca4-c849c22ad252\") " pod="openshift-multus/multus-additional-cni-plugins-zdfsz" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.669719 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nlv87\" (UniqueName: \"kubernetes.io/projected/0f189ec1-8169-4886-bff6-f67f7958050e-kube-api-access-nlv87\") pod \"ovnkube-node-zc9xl\" (UID: \"0f189ec1-8169-4886-bff6-f67f7958050e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.669746 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/96bf1ee1-9249-4db9-83da-17eb15aa5f9a-host-var-lib-cni-bin\") pod \"multus-rwqt7\" (UID: \"96bf1ee1-9249-4db9-83da-17eb15aa5f9a\") " pod="openshift-multus/multus-rwqt7" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.669799 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/66a6f849-6870-4e28-8ca4-c849c22ad252-tuning-conf-dir\") pod \"multus-additional-cni-plugins-zdfsz\" (UID: \"66a6f849-6870-4e28-8ca4-c849c22ad252\") " pod="openshift-multus/multus-additional-cni-plugins-zdfsz" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.669824 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/96bf1ee1-9249-4db9-83da-17eb15aa5f9a-host-var-lib-kubelet\") pod \"multus-rwqt7\" (UID: \"96bf1ee1-9249-4db9-83da-17eb15aa5f9a\") " pod="openshift-multus/multus-rwqt7" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.669888 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0f189ec1-8169-4886-bff6-f67f7958050e-ovn-node-metrics-cert\") pod \"ovnkube-node-zc9xl\" (UID: \"0f189ec1-8169-4886-bff6-f67f7958050e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.669936 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/96bf1ee1-9249-4db9-83da-17eb15aa5f9a-cni-binary-copy\") pod \"multus-rwqt7\" (UID: \"96bf1ee1-9249-4db9-83da-17eb15aa5f9a\") " pod="openshift-multus/multus-rwqt7" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.669961 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/96bf1ee1-9249-4db9-83da-17eb15aa5f9a-multus-socket-dir-parent\") pod \"multus-rwqt7\" (UID: \"96bf1ee1-9249-4db9-83da-17eb15aa5f9a\") " pod="openshift-multus/multus-rwqt7" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.669994 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/66a6f849-6870-4e28-8ca4-c849c22ad252-os-release\") pod \"multus-additional-cni-plugins-zdfsz\" (UID: \"66a6f849-6870-4e28-8ca4-c849c22ad252\") " pod="openshift-multus/multus-additional-cni-plugins-zdfsz" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.670041 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/0f189ec1-8169-4886-bff6-f67f7958050e-run-ovn\") pod \"ovnkube-node-zc9xl\" (UID: \"0f189ec1-8169-4886-bff6-f67f7958050e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.670173 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0f189ec1-8169-4886-bff6-f67f7958050e-etc-openvswitch\") pod \"ovnkube-node-zc9xl\" (UID: \"0f189ec1-8169-4886-bff6-f67f7958050e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.670205 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0f189ec1-8169-4886-bff6-f67f7958050e-host-cni-bin\") pod \"ovnkube-node-zc9xl\" (UID: \"0f189ec1-8169-4886-bff6-f67f7958050e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.670231 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gf9vv\" (UniqueName: \"kubernetes.io/projected/66a6f849-6870-4e28-8ca4-c849c22ad252-kube-api-access-gf9vv\") pod \"multus-additional-cni-plugins-zdfsz\" (UID: \"66a6f849-6870-4e28-8ca4-c849c22ad252\") " pod="openshift-multus/multus-additional-cni-plugins-zdfsz" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.670255 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/0f189ec1-8169-4886-bff6-f67f7958050e-systemd-units\") pod \"ovnkube-node-zc9xl\" (UID: \"0f189ec1-8169-4886-bff6-f67f7958050e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.670277 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/66a6f849-6870-4e28-8ca4-c849c22ad252-cni-binary-copy\") pod \"multus-additional-cni-plugins-zdfsz\" (UID: \"66a6f849-6870-4e28-8ca4-c849c22ad252\") " pod="openshift-multus/multus-additional-cni-plugins-zdfsz" Oct 03 00:07:11 crc 
kubenswrapper[4832]: I1003 00:07:11.670313 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/0f189ec1-8169-4886-bff6-f67f7958050e-host-kubelet\") pod \"ovnkube-node-zc9xl\" (UID: \"0f189ec1-8169-4886-bff6-f67f7958050e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.670335 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0f189ec1-8169-4886-bff6-f67f7958050e-host-slash\") pod \"ovnkube-node-zc9xl\" (UID: \"0f189ec1-8169-4886-bff6-f67f7958050e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.670368 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/96bf1ee1-9249-4db9-83da-17eb15aa5f9a-system-cni-dir\") pod \"multus-rwqt7\" (UID: \"96bf1ee1-9249-4db9-83da-17eb15aa5f9a\") " pod="openshift-multus/multus-rwqt7" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.670396 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0f189ec1-8169-4886-bff6-f67f7958050e-env-overrides\") pod \"ovnkube-node-zc9xl\" (UID: \"0f189ec1-8169-4886-bff6-f67f7958050e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.670432 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0f189ec1-8169-4886-bff6-f67f7958050e-run-openvswitch\") pod \"ovnkube-node-zc9xl\" (UID: \"0f189ec1-8169-4886-bff6-f67f7958050e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.670454 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/0f189ec1-8169-4886-bff6-f67f7958050e-log-socket\") pod \"ovnkube-node-zc9xl\" (UID: \"0f189ec1-8169-4886-bff6-f67f7958050e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.670475 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0f189ec1-8169-4886-bff6-f67f7958050e-host-run-ovn-kubernetes\") pod \"ovnkube-node-zc9xl\" (UID: \"0f189ec1-8169-4886-bff6-f67f7958050e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.670498 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/0f189ec1-8169-4886-bff6-f67f7958050e-host-cni-netd\") pod \"ovnkube-node-zc9xl\" (UID: \"0f189ec1-8169-4886-bff6-f67f7958050e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.670519 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/96bf1ee1-9249-4db9-83da-17eb15aa5f9a-os-release\") pod \"multus-rwqt7\" (UID: \"96bf1ee1-9249-4db9-83da-17eb15aa5f9a\") " pod="openshift-multus/multus-rwqt7" 
Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.670539 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/96bf1ee1-9249-4db9-83da-17eb15aa5f9a-host-run-netns\") pod \"multus-rwqt7\" (UID: \"96bf1ee1-9249-4db9-83da-17eb15aa5f9a\") " pod="openshift-multus/multus-rwqt7" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.670564 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/66a6f849-6870-4e28-8ca4-c849c22ad252-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-zdfsz\" (UID: \"66a6f849-6870-4e28-8ca4-c849c22ad252\") " pod="openshift-multus/multus-additional-cni-plugins-zdfsz" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.670588 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0f189ec1-8169-4886-bff6-f67f7958050e-var-lib-openvswitch\") pod \"ovnkube-node-zc9xl\" (UID: \"0f189ec1-8169-4886-bff6-f67f7958050e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.670610 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/0f189ec1-8169-4886-bff6-f67f7958050e-node-log\") pod \"ovnkube-node-zc9xl\" (UID: \"0f189ec1-8169-4886-bff6-f67f7958050e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.670630 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/96bf1ee1-9249-4db9-83da-17eb15aa5f9a-multus-cni-dir\") pod \"multus-rwqt7\" (UID: \"96bf1ee1-9249-4db9-83da-17eb15aa5f9a\") " pod="openshift-multus/multus-rwqt7" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.670651 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0f189ec1-8169-4886-bff6-f67f7958050e-host-run-netns\") pod \"ovnkube-node-zc9xl\" (UID: \"0f189ec1-8169-4886-bff6-f67f7958050e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.670674 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/0f189ec1-8169-4886-bff6-f67f7958050e-run-systemd\") pod \"ovnkube-node-zc9xl\" (UID: \"0f189ec1-8169-4886-bff6-f67f7958050e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.670695 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/0f189ec1-8169-4886-bff6-f67f7958050e-ovnkube-script-lib\") pod \"ovnkube-node-zc9xl\" (UID: \"0f189ec1-8169-4886-bff6-f67f7958050e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.670724 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/96bf1ee1-9249-4db9-83da-17eb15aa5f9a-etc-kubernetes\") pod \"multus-rwqt7\" (UID: 
\"96bf1ee1-9249-4db9-83da-17eb15aa5f9a\") " pod="openshift-multus/multus-rwqt7" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.670779 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0f189ec1-8169-4886-bff6-f67f7958050e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-zc9xl\" (UID: \"0f189ec1-8169-4886-bff6-f67f7958050e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.670805 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0f189ec1-8169-4886-bff6-f67f7958050e-ovnkube-config\") pod \"ovnkube-node-zc9xl\" (UID: \"0f189ec1-8169-4886-bff6-f67f7958050e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.674689 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4175afbbdf7905f2c69f608e1cd85fdfae2818582d88e9a6d46c20ba0cede788\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:11Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.692977 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6753f7eb0f3ff7e4c94125513be77026242079d658b7139e86701a2b04ba5d1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://823db4c8596fa85759a3d818b0282b8b8b2c4c2e948160accbd306ef09e38060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:11Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.718369 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rwqt7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96bf1ee1-9249-4db9-83da-17eb15aa5f9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm92t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rwqt7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:11Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.745106 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:11Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.771850 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/96bf1ee1-9249-4db9-83da-17eb15aa5f9a-multus-daemon-config\") pod \"multus-rwqt7\" (UID: \"96bf1ee1-9249-4db9-83da-17eb15aa5f9a\") " pod="openshift-multus/multus-rwqt7" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.771885 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/96bf1ee1-9249-4db9-83da-17eb15aa5f9a-host-run-k8s-cni-cncf-io\") pod \"multus-rwqt7\" (UID: \"96bf1ee1-9249-4db9-83da-17eb15aa5f9a\") " pod="openshift-multus/multus-rwqt7" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.771900 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/96bf1ee1-9249-4db9-83da-17eb15aa5f9a-host-run-multus-certs\") pod \"multus-rwqt7\" (UID: \"96bf1ee1-9249-4db9-83da-17eb15aa5f9a\") " pod="openshift-multus/multus-rwqt7" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.771915 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/66a6f849-6870-4e28-8ca4-c849c22ad252-system-cni-dir\") pod \"multus-additional-cni-plugins-zdfsz\" (UID: \"66a6f849-6870-4e28-8ca4-c849c22ad252\") " pod="openshift-multus/multus-additional-cni-plugins-zdfsz" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.771928 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/96bf1ee1-9249-4db9-83da-17eb15aa5f9a-hostroot\") pod \"multus-rwqt7\" (UID: \"96bf1ee1-9249-4db9-83da-17eb15aa5f9a\") " pod="openshift-multus/multus-rwqt7" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.771943 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/96bf1ee1-9249-4db9-83da-17eb15aa5f9a-multus-conf-dir\") pod \"multus-rwqt7\" (UID: \"96bf1ee1-9249-4db9-83da-17eb15aa5f9a\") " pod="openshift-multus/multus-rwqt7" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.771957 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/66a6f849-6870-4e28-8ca4-c849c22ad252-cnibin\") pod \"multus-additional-cni-plugins-zdfsz\" (UID: \"66a6f849-6870-4e28-8ca4-c849c22ad252\") " pod="openshift-multus/multus-additional-cni-plugins-zdfsz" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.772013 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/66a6f849-6870-4e28-8ca4-c849c22ad252-system-cni-dir\") pod \"multus-additional-cni-plugins-zdfsz\" (UID: \"66a6f849-6870-4e28-8ca4-c849c22ad252\") " pod="openshift-multus/multus-additional-cni-plugins-zdfsz" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.772032 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/96bf1ee1-9249-4db9-83da-17eb15aa5f9a-host-run-k8s-cni-cncf-io\") pod \"multus-rwqt7\" (UID: \"96bf1ee1-9249-4db9-83da-17eb15aa5f9a\") " pod="openshift-multus/multus-rwqt7" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.772088 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/96bf1ee1-9249-4db9-83da-17eb15aa5f9a-host-run-multus-certs\") pod \"multus-rwqt7\" (UID: \"96bf1ee1-9249-4db9-83da-17eb15aa5f9a\") " pod="openshift-multus/multus-rwqt7" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.772096 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/96bf1ee1-9249-4db9-83da-17eb15aa5f9a-multus-conf-dir\") pod \"multus-rwqt7\" (UID: \"96bf1ee1-9249-4db9-83da-17eb15aa5f9a\") " pod="openshift-multus/multus-rwqt7" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.772096 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/96bf1ee1-9249-4db9-83da-17eb15aa5f9a-hostroot\") pod \"multus-rwqt7\" (UID: 
\"96bf1ee1-9249-4db9-83da-17eb15aa5f9a\") " pod="openshift-multus/multus-rwqt7" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.772146 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/66a6f849-6870-4e28-8ca4-c849c22ad252-cnibin\") pod \"multus-additional-cni-plugins-zdfsz\" (UID: \"66a6f849-6870-4e28-8ca4-c849c22ad252\") " pod="openshift-multus/multus-additional-cni-plugins-zdfsz" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.772206 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/96bf1ee1-9249-4db9-83da-17eb15aa5f9a-cnibin\") pod \"multus-rwqt7\" (UID: \"96bf1ee1-9249-4db9-83da-17eb15aa5f9a\") " pod="openshift-multus/multus-rwqt7" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.772688 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/96bf1ee1-9249-4db9-83da-17eb15aa5f9a-multus-daemon-config\") pod \"multus-rwqt7\" (UID: \"96bf1ee1-9249-4db9-83da-17eb15aa5f9a\") " pod="openshift-multus/multus-rwqt7" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.772726 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/96bf1ee1-9249-4db9-83da-17eb15aa5f9a-cnibin\") pod \"multus-rwqt7\" (UID: \"96bf1ee1-9249-4db9-83da-17eb15aa5f9a\") " pod="openshift-multus/multus-rwqt7" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.772780 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nlv87\" (UniqueName: \"kubernetes.io/projected/0f189ec1-8169-4886-bff6-f67f7958050e-kube-api-access-nlv87\") pod \"ovnkube-node-zc9xl\" (UID: \"0f189ec1-8169-4886-bff6-f67f7958050e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.772797 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/66a6f849-6870-4e28-8ca4-c849c22ad252-tuning-conf-dir\") pod \"multus-additional-cni-plugins-zdfsz\" (UID: \"66a6f849-6870-4e28-8ca4-c849c22ad252\") " pod="openshift-multus/multus-additional-cni-plugins-zdfsz" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.772811 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/96bf1ee1-9249-4db9-83da-17eb15aa5f9a-host-var-lib-cni-bin\") pod \"multus-rwqt7\" (UID: \"96bf1ee1-9249-4db9-83da-17eb15aa5f9a\") " pod="openshift-multus/multus-rwqt7" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.772970 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/96bf1ee1-9249-4db9-83da-17eb15aa5f9a-host-var-lib-cni-bin\") pod \"multus-rwqt7\" (UID: \"96bf1ee1-9249-4db9-83da-17eb15aa5f9a\") " pod="openshift-multus/multus-rwqt7" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.773088 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/96bf1ee1-9249-4db9-83da-17eb15aa5f9a-host-var-lib-kubelet\") pod \"multus-rwqt7\" (UID: \"96bf1ee1-9249-4db9-83da-17eb15aa5f9a\") " pod="openshift-multus/multus-rwqt7" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.773142 4832 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/96bf1ee1-9249-4db9-83da-17eb15aa5f9a-host-var-lib-kubelet\") pod \"multus-rwqt7\" (UID: \"96bf1ee1-9249-4db9-83da-17eb15aa5f9a\") " pod="openshift-multus/multus-rwqt7" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.773112 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/96bf1ee1-9249-4db9-83da-17eb15aa5f9a-multus-socket-dir-parent\") pod \"multus-rwqt7\" (UID: \"96bf1ee1-9249-4db9-83da-17eb15aa5f9a\") " pod="openshift-multus/multus-rwqt7" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.773175 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/66a6f849-6870-4e28-8ca4-c849c22ad252-os-release\") pod \"multus-additional-cni-plugins-zdfsz\" (UID: \"66a6f849-6870-4e28-8ca4-c849c22ad252\") " pod="openshift-multus/multus-additional-cni-plugins-zdfsz" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.773192 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/0f189ec1-8169-4886-bff6-f67f7958050e-run-ovn\") pod \"ovnkube-node-zc9xl\" (UID: \"0f189ec1-8169-4886-bff6-f67f7958050e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.773238 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/96bf1ee1-9249-4db9-83da-17eb15aa5f9a-multus-socket-dir-parent\") pod \"multus-rwqt7\" (UID: \"96bf1ee1-9249-4db9-83da-17eb15aa5f9a\") " pod="openshift-multus/multus-rwqt7" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.773329 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/0f189ec1-8169-4886-bff6-f67f7958050e-run-ovn\") pod \"ovnkube-node-zc9xl\" (UID: \"0f189ec1-8169-4886-bff6-f67f7958050e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.773405 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/66a6f849-6870-4e28-8ca4-c849c22ad252-os-release\") pod \"multus-additional-cni-plugins-zdfsz\" (UID: \"66a6f849-6870-4e28-8ca4-c849c22ad252\") " pod="openshift-multus/multus-additional-cni-plugins-zdfsz" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.773444 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0f189ec1-8169-4886-bff6-f67f7958050e-ovn-node-metrics-cert\") pod \"ovnkube-node-zc9xl\" (UID: \"0f189ec1-8169-4886-bff6-f67f7958050e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.773459 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/96bf1ee1-9249-4db9-83da-17eb15aa5f9a-cni-binary-copy\") pod \"multus-rwqt7\" (UID: \"96bf1ee1-9249-4db9-83da-17eb15aa5f9a\") " pod="openshift-multus/multus-rwqt7" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.773585 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/66a6f849-6870-4e28-8ca4-c849c22ad252-tuning-conf-dir\") pod \"multus-additional-cni-plugins-zdfsz\" (UID: \"66a6f849-6870-4e28-8ca4-c849c22ad252\") " pod="openshift-multus/multus-additional-cni-plugins-zdfsz" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.773621 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0f189ec1-8169-4886-bff6-f67f7958050e-etc-openvswitch\") pod \"ovnkube-node-zc9xl\" (UID: \"0f189ec1-8169-4886-bff6-f67f7958050e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.773901 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0f189ec1-8169-4886-bff6-f67f7958050e-etc-openvswitch\") pod \"ovnkube-node-zc9xl\" (UID: \"0f189ec1-8169-4886-bff6-f67f7958050e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.773929 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0f189ec1-8169-4886-bff6-f67f7958050e-host-cni-bin\") pod \"ovnkube-node-zc9xl\" (UID: \"0f189ec1-8169-4886-bff6-f67f7958050e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.773946 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/0f189ec1-8169-4886-bff6-f67f7958050e-systemd-units\") pod \"ovnkube-node-zc9xl\" (UID: \"0f189ec1-8169-4886-bff6-f67f7958050e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.773961 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gf9vv\" (UniqueName: \"kubernetes.io/projected/66a6f849-6870-4e28-8ca4-c849c22ad252-kube-api-access-gf9vv\") pod \"multus-additional-cni-plugins-zdfsz\" (UID: \"66a6f849-6870-4e28-8ca4-c849c22ad252\") " pod="openshift-multus/multus-additional-cni-plugins-zdfsz" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.773980 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/96bf1ee1-9249-4db9-83da-17eb15aa5f9a-system-cni-dir\") pod \"multus-rwqt7\" (UID: \"96bf1ee1-9249-4db9-83da-17eb15aa5f9a\") " pod="openshift-multus/multus-rwqt7" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.773995 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/66a6f849-6870-4e28-8ca4-c849c22ad252-cni-binary-copy\") pod \"multus-additional-cni-plugins-zdfsz\" (UID: \"66a6f849-6870-4e28-8ca4-c849c22ad252\") " pod="openshift-multus/multus-additional-cni-plugins-zdfsz" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.774008 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/0f189ec1-8169-4886-bff6-f67f7958050e-host-kubelet\") pod \"ovnkube-node-zc9xl\" (UID: \"0f189ec1-8169-4886-bff6-f67f7958050e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.774022 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/0f189ec1-8169-4886-bff6-f67f7958050e-host-slash\") pod \"ovnkube-node-zc9xl\" (UID: \"0f189ec1-8169-4886-bff6-f67f7958050e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.774034 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/0f189ec1-8169-4886-bff6-f67f7958050e-systemd-units\") pod \"ovnkube-node-zc9xl\" (UID: \"0f189ec1-8169-4886-bff6-f67f7958050e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.774050 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0f189ec1-8169-4886-bff6-f67f7958050e-host-cni-bin\") pod \"ovnkube-node-zc9xl\" (UID: \"0f189ec1-8169-4886-bff6-f67f7958050e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.774063 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0f189ec1-8169-4886-bff6-f67f7958050e-run-openvswitch\") pod \"ovnkube-node-zc9xl\" (UID: \"0f189ec1-8169-4886-bff6-f67f7958050e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.774099 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0f189ec1-8169-4886-bff6-f67f7958050e-host-slash\") pod \"ovnkube-node-zc9xl\" (UID: \"0f189ec1-8169-4886-bff6-f67f7958050e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.774106 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0f189ec1-8169-4886-bff6-f67f7958050e-env-overrides\") pod \"ovnkube-node-zc9xl\" (UID: \"0f189ec1-8169-4886-bff6-f67f7958050e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.774156 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/66a6f849-6870-4e28-8ca4-c849c22ad252-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-zdfsz\" (UID: \"66a6f849-6870-4e28-8ca4-c849c22ad252\") " pod="openshift-multus/multus-additional-cni-plugins-zdfsz" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.774175 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/96bf1ee1-9249-4db9-83da-17eb15aa5f9a-system-cni-dir\") pod \"multus-rwqt7\" (UID: \"96bf1ee1-9249-4db9-83da-17eb15aa5f9a\") " pod="openshift-multus/multus-rwqt7" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.774176 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0f189ec1-8169-4886-bff6-f67f7958050e-var-lib-openvswitch\") pod \"ovnkube-node-zc9xl\" (UID: \"0f189ec1-8169-4886-bff6-f67f7958050e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.774195 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0f189ec1-8169-4886-bff6-f67f7958050e-var-lib-openvswitch\") pod \"ovnkube-node-zc9xl\" (UID: 
\"0f189ec1-8169-4886-bff6-f67f7958050e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.774262 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/0f189ec1-8169-4886-bff6-f67f7958050e-node-log\") pod \"ovnkube-node-zc9xl\" (UID: \"0f189ec1-8169-4886-bff6-f67f7958050e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.774299 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/0f189ec1-8169-4886-bff6-f67f7958050e-host-kubelet\") pod \"ovnkube-node-zc9xl\" (UID: \"0f189ec1-8169-4886-bff6-f67f7958050e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.774302 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0f189ec1-8169-4886-bff6-f67f7958050e-run-openvswitch\") pod \"ovnkube-node-zc9xl\" (UID: \"0f189ec1-8169-4886-bff6-f67f7958050e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.774318 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/0f189ec1-8169-4886-bff6-f67f7958050e-log-socket\") pod \"ovnkube-node-zc9xl\" (UID: \"0f189ec1-8169-4886-bff6-f67f7958050e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.774338 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0f189ec1-8169-4886-bff6-f67f7958050e-host-run-ovn-kubernetes\") pod \"ovnkube-node-zc9xl\" (UID: \"0f189ec1-8169-4886-bff6-f67f7958050e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.774369 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/0f189ec1-8169-4886-bff6-f67f7958050e-host-cni-netd\") pod \"ovnkube-node-zc9xl\" (UID: \"0f189ec1-8169-4886-bff6-f67f7958050e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.774385 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/96bf1ee1-9249-4db9-83da-17eb15aa5f9a-os-release\") pod \"multus-rwqt7\" (UID: \"96bf1ee1-9249-4db9-83da-17eb15aa5f9a\") " pod="openshift-multus/multus-rwqt7" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.774401 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/96bf1ee1-9249-4db9-83da-17eb15aa5f9a-host-run-netns\") pod \"multus-rwqt7\" (UID: \"96bf1ee1-9249-4db9-83da-17eb15aa5f9a\") " pod="openshift-multus/multus-rwqt7" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.774419 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/96bf1ee1-9249-4db9-83da-17eb15aa5f9a-multus-cni-dir\") pod \"multus-rwqt7\" (UID: \"96bf1ee1-9249-4db9-83da-17eb15aa5f9a\") " pod="openshift-multus/multus-rwqt7" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.774461 4832 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0f189ec1-8169-4886-bff6-f67f7958050e-host-run-netns\") pod \"ovnkube-node-zc9xl\" (UID: \"0f189ec1-8169-4886-bff6-f67f7958050e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.774479 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/0f189ec1-8169-4886-bff6-f67f7958050e-run-systemd\") pod \"ovnkube-node-zc9xl\" (UID: \"0f189ec1-8169-4886-bff6-f67f7958050e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.774495 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/96bf1ee1-9249-4db9-83da-17eb15aa5f9a-etc-kubernetes\") pod \"multus-rwqt7\" (UID: \"96bf1ee1-9249-4db9-83da-17eb15aa5f9a\") " pod="openshift-multus/multus-rwqt7" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.774515 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0f189ec1-8169-4886-bff6-f67f7958050e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-zc9xl\" (UID: \"0f189ec1-8169-4886-bff6-f67f7958050e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.774531 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0f189ec1-8169-4886-bff6-f67f7958050e-ovnkube-config\") pod \"ovnkube-node-zc9xl\" (UID: \"0f189ec1-8169-4886-bff6-f67f7958050e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.774697 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/0f189ec1-8169-4886-bff6-f67f7958050e-ovnkube-script-lib\") pod \"ovnkube-node-zc9xl\" (UID: \"0f189ec1-8169-4886-bff6-f67f7958050e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.774796 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0f189ec1-8169-4886-bff6-f67f7958050e-env-overrides\") pod \"ovnkube-node-zc9xl\" (UID: \"0f189ec1-8169-4886-bff6-f67f7958050e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.774817 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/96bf1ee1-9249-4db9-83da-17eb15aa5f9a-host-var-lib-cni-multus\") pod \"multus-rwqt7\" (UID: \"96bf1ee1-9249-4db9-83da-17eb15aa5f9a\") " pod="openshift-multus/multus-rwqt7" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.774801 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/96bf1ee1-9249-4db9-83da-17eb15aa5f9a-host-var-lib-cni-multus\") pod \"multus-rwqt7\" (UID: \"96bf1ee1-9249-4db9-83da-17eb15aa5f9a\") " pod="openshift-multus/multus-rwqt7" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.774847 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nm92t\" 
(UniqueName: \"kubernetes.io/projected/96bf1ee1-9249-4db9-83da-17eb15aa5f9a-kube-api-access-nm92t\") pod \"multus-rwqt7\" (UID: \"96bf1ee1-9249-4db9-83da-17eb15aa5f9a\") " pod="openshift-multus/multus-rwqt7" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.774852 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/96bf1ee1-9249-4db9-83da-17eb15aa5f9a-os-release\") pod \"multus-rwqt7\" (UID: \"96bf1ee1-9249-4db9-83da-17eb15aa5f9a\") " pod="openshift-multus/multus-rwqt7" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.775014 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/66a6f849-6870-4e28-8ca4-c849c22ad252-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-zdfsz\" (UID: \"66a6f849-6870-4e28-8ca4-c849c22ad252\") " pod="openshift-multus/multus-additional-cni-plugins-zdfsz" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.775020 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/66a6f849-6870-4e28-8ca4-c849c22ad252-cni-binary-copy\") pod \"multus-additional-cni-plugins-zdfsz\" (UID: \"66a6f849-6870-4e28-8ca4-c849c22ad252\") " pod="openshift-multus/multus-additional-cni-plugins-zdfsz" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.775044 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/0f189ec1-8169-4886-bff6-f67f7958050e-run-systemd\") pod \"ovnkube-node-zc9xl\" (UID: \"0f189ec1-8169-4886-bff6-f67f7958050e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.775068 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/0f189ec1-8169-4886-bff6-f67f7958050e-node-log\") pod \"ovnkube-node-zc9xl\" (UID: \"0f189ec1-8169-4886-bff6-f67f7958050e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.775099 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0f189ec1-8169-4886-bff6-f67f7958050e-host-run-netns\") pod \"ovnkube-node-zc9xl\" (UID: \"0f189ec1-8169-4886-bff6-f67f7958050e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.775104 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/0f189ec1-8169-4886-bff6-f67f7958050e-log-socket\") pod \"ovnkube-node-zc9xl\" (UID: \"0f189ec1-8169-4886-bff6-f67f7958050e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.775121 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/96bf1ee1-9249-4db9-83da-17eb15aa5f9a-host-run-netns\") pod \"multus-rwqt7\" (UID: \"96bf1ee1-9249-4db9-83da-17eb15aa5f9a\") " pod="openshift-multus/multus-rwqt7" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.775127 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/0f189ec1-8169-4886-bff6-f67f7958050e-host-cni-netd\") pod \"ovnkube-node-zc9xl\" (UID: \"0f189ec1-8169-4886-bff6-f67f7958050e\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.775148 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0f189ec1-8169-4886-bff6-f67f7958050e-host-run-ovn-kubernetes\") pod \"ovnkube-node-zc9xl\" (UID: \"0f189ec1-8169-4886-bff6-f67f7958050e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.775171 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0f189ec1-8169-4886-bff6-f67f7958050e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-zc9xl\" (UID: \"0f189ec1-8169-4886-bff6-f67f7958050e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.775195 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/96bf1ee1-9249-4db9-83da-17eb15aa5f9a-etc-kubernetes\") pod \"multus-rwqt7\" (UID: \"96bf1ee1-9249-4db9-83da-17eb15aa5f9a\") " pod="openshift-multus/multus-rwqt7" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.775251 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/96bf1ee1-9249-4db9-83da-17eb15aa5f9a-multus-cni-dir\") pod \"multus-rwqt7\" (UID: \"96bf1ee1-9249-4db9-83da-17eb15aa5f9a\") " pod="openshift-multus/multus-rwqt7" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.775390 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/96bf1ee1-9249-4db9-83da-17eb15aa5f9a-cni-binary-copy\") pod \"multus-rwqt7\" (UID: \"96bf1ee1-9249-4db9-83da-17eb15aa5f9a\") " pod="openshift-multus/multus-rwqt7" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.775464 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/0f189ec1-8169-4886-bff6-f67f7958050e-ovnkube-script-lib\") pod \"ovnkube-node-zc9xl\" (UID: \"0f189ec1-8169-4886-bff6-f67f7958050e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.775638 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0f189ec1-8169-4886-bff6-f67f7958050e-ovnkube-config\") pod \"ovnkube-node-zc9xl\" (UID: \"0f189ec1-8169-4886-bff6-f67f7958050e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.778708 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0f189ec1-8169-4886-bff6-f67f7958050e-ovn-node-metrics-cert\") pod \"ovnkube-node-zc9xl\" (UID: \"0f189ec1-8169-4886-bff6-f67f7958050e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.779805 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zdfsz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"66a6f849-6870-4e28-8ca4-c849c22ad252\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"
name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\
\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zdfsz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:11Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.802332 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gf9vv\" (UniqueName: \"kubernetes.io/projected/66a6f849-6870-4e28-8ca4-c849c22ad252-kube-api-access-gf9vv\") pod \"multus-additional-cni-plugins-zdfsz\" (UID: \"66a6f849-6870-4e28-8ca4-c849c22ad252\") " pod="openshift-multus/multus-additional-cni-plugins-zdfsz" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.807871 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nm92t\" (UniqueName: \"kubernetes.io/projected/96bf1ee1-9249-4db9-83da-17eb15aa5f9a-kube-api-access-nm92t\") pod \"multus-rwqt7\" (UID: \"96bf1ee1-9249-4db9-83da-17eb15aa5f9a\") " pod="openshift-multus/multus-rwqt7" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.810433 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nlv87\" (UniqueName: \"kubernetes.io/projected/0f189ec1-8169-4886-bff6-f67f7958050e-kube-api-access-nlv87\") pod \"ovnkube-node-zc9xl\" (UID: \"0f189ec1-8169-4886-bff6-f67f7958050e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.813945 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:11Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.843692 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zdfsz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66a6f849-6870-4e28-8ca4-c849c22ad252\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zdfsz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:11Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.845850 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-zdfsz" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.852062 4832 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-rwqt7" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.858466 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.864046 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78a6b04d90ea1fd4356f0a54594a83fd6029be216c6263654ad4916f2c3b7f5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:11Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.875479 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-b9c9g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4369beec-ad3f-4de9-909b-7d0f8c76aad3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvpm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-b9c9g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:11Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:11 crc kubenswrapper[4832]: W1003 00:07:11.878314 4832 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod96bf1ee1_9249_4db9_83da_17eb15aa5f9a.slice/crio-5bac065f410171736b77e10f8559dc6875aed7502f78df6fb18ad2e011f0311a WatchSource:0}: Error finding container 5bac065f410171736b77e10f8559dc6875aed7502f78df6fb18ad2e011f0311a: Status 404 returned error can't find the container with id 5bac065f410171736b77e10f8559dc6875aed7502f78df6fb18ad2e011f0311a Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.904974 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f189ec1-8169-4886-bff6-f67f7958050e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":
\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zc9xl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:11Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:11 crc kubenswrapper[4832]: 
I1003 00:07:11.922406 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fad7ffe6-f721-4c38-8acd-2d4de59b527b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da0287ba03b221e49a8840f93c53df8a865b9216a4e93f4a0fc4c62d79227004\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edaa07dca4d8aba0a6439a43af7397b5d9ef4d5a398f5c03ac93f3f41fb80ed9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef77e4c71ea4d3dd7aa61afb0787bf88352730ae638cfefe1f9f9f033208692a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\
\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0636ac12ebbda7e4ee8bbe4f5be5454ca7a879c92c8f444466a34c387668398c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af8925d64d9d95117625cfca58e2f0c90128ce63c82370e3a505d48fce14e474\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"message\\\":\\\"file observer\\\\nW1003 00:07:06.003008 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 00:07:06.003232 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 00:07:06.007835 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-713505854/tls.crt::/tmp/serving-cert-713505854/tls.key\\\\\\\"\\\\nI1003 00:07:06.247728 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 00:07:06.253351 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 00:07:06.253377 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 00:07:06.253665 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 00:07:06.253677 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 00:07:06.261087 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 00:07:06.261596 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 00:07:06.261602 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 00:07:06.261606 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 00:07:06.261610 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 00:07:06.261613 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 00:07:06.261616 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 00:07:06.261905 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 00:07:06.264368 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc6fca5c4906594dc721c77dba62a306d840e88ed949a9a579a695bdc1f1c011\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://359b3f91c772c18a52ed1e5985275405b990187a860cf7a7f62c2f602f54c77e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://359b3f91c772c18a52ed1e5985275405b990187a860cf7a7f62c2f602f54c77e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:06:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:11Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.937840 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a530701-412d-4b02-b532-de7993881992\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2b0553c8f774a096bc1446325436a1a73a24f67b3512462ce0b5029092f30c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0451961c80d1affd1cc6903d8c4ccba7de47c25255826d49ee312958f7a73b79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09e57b2ff533ebe739ef6e23dcd8d6f5ef85143046db77ed3a95ffb87f771152\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://703d4dc13279deda56e8154ab7238b507e78852fd477db2bbe6e3e19aa0fbd37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:06:47Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:11Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.949730 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:11Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.960096 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13eeb165-19f7-4e3f-a442-1c9c570d853b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfnk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfnk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xzpk4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:11Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:11 crc kubenswrapper[4832]: W1003 00:07:11.963211 4832 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0f189ec1_8169_4886_bff6_f67f7958050e.slice/crio-893a5dd7a6050b0f460f109807438c3564225c0c9254db0c028cc87736b5b8d7 WatchSource:0}: Error finding container 893a5dd7a6050b0f460f109807438c3564225c0c9254db0c028cc87736b5b8d7: Status 404 returned error can't find the container with id 893a5dd7a6050b0f460f109807438c3564225c0c9254db0c028cc87736b5b8d7 Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.980495 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"31c3e0e9-c2a7-4c60-b523-9972df4adad5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1688f7425bdf5630147752e6d2c0dc58351b86d087c39bc6c4d73ffbf11949b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70f4161badca9a8087ab95b77ede5bc72d0c44b3f7f0223f2073354e76d6070d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5eb1fa15d84200378239e8a064cb83aa401750e92073f9d29d94858e65add776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://342b151ec083dc566428af857e4013b70d34fbf
70632147d13f5e251ef3460c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f39a6c9881b2f1d620a85a3c6887d82b81d18ecf599c66b967d5580689537314\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a180babeb6469b2158474f74047864c74117a5c272be9437712bc1729563108b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a180babeb6469b2158474f74047864c74117a5c272be9437712bc1729563108b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b12fcbe54af7073178320f8a130e78196b2d1d3a055c4d8410e4cc1da2d4686e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b12fcbe54af7073178320f8a130e78196b2d1d3a055c4d8410e4cc1da2d4686e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f5e7853d0c8e96873c687c75d7226d8c11ed0ab5f4f2e1e975423de06b7c8c10\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5e7853d0c8e96873c687c75d7226d8c11ed0ab5f4f2e1e975423de06b7c8c10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:06:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:11Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:11 crc kubenswrapper[4832]: I1003 00:07:11.994311 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:11Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:12 crc kubenswrapper[4832]: I1003 00:07:12.005274 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4175afbbdf7905f2c69f608e1cd85fdfae2818582d88e9a6d46c20ba0cede788\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:12Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:12 crc kubenswrapper[4832]: I1003 00:07:12.016852 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6753f7eb0f3ff7e4c94125513be77026242079d658b7139e86701a2b04ba5d1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://823db4c8596fa85759a3d818b0282b8b8b2c4c2e948160accbd306ef09e38060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:12Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:12 crc kubenswrapper[4832]: I1003 00:07:12.029265 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rwqt7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96bf1ee1-9249-4db9-83da-17eb15aa5f9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm92t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rwqt7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:12Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:12 crc kubenswrapper[4832]: I1003 00:07:12.398870 4832 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 00:07:12 crc kubenswrapper[4832]: I1003 00:07:12.400616 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:12 crc kubenswrapper[4832]: I1003 00:07:12.400661 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:12 crc kubenswrapper[4832]: I1003 00:07:12.400671 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:12 crc kubenswrapper[4832]: I1003 00:07:12.400774 4832 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 03 00:07:12 crc kubenswrapper[4832]: I1003 00:07:12.407537 4832 kubelet_node_status.go:115] "Node was previously registered" node="crc" Oct 03 00:07:12 crc kubenswrapper[4832]: I1003 00:07:12.407811 4832 kubelet_node_status.go:79] "Successfully registered node" node="crc" Oct 03 00:07:12 crc kubenswrapper[4832]: I1003 00:07:12.408740 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:12 crc kubenswrapper[4832]: I1003 00:07:12.408792 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:12 crc kubenswrapper[4832]: I1003 00:07:12.408805 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:12 crc kubenswrapper[4832]: I1003 00:07:12.408823 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:12 crc kubenswrapper[4832]: I1003 00:07:12.408837 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:12Z","lastTransitionTime":"2025-10-03T00:07:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:12 crc kubenswrapper[4832]: E1003 00:07:12.425660 4832 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1e5f816a-2677-4a9b-8b73-ae876a2ba5f2\\\",\\\"systemUUID\\\":\\\"22c336a9-c598-416a-a85e-c54be0a6a3c5\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:12Z is after 
2025-08-24T17:21:41Z"
Oct 03 00:07:12 crc kubenswrapper[4832]: I1003 00:07:12.428720 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 00:07:12 crc kubenswrapper[4832]: I1003 00:07:12.428770 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 00:07:12 crc kubenswrapper[4832]: I1003 00:07:12.428783 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 00:07:12 crc kubenswrapper[4832]: I1003 00:07:12.428799 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 00:07:12 crc kubenswrapper[4832]: I1003 00:07:12.428811 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:12Z","lastTransitionTime":"2025-10-03T00:07:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 00:07:12 crc kubenswrapper[4832]: E1003 00:07:12.439419 4832 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status [node status patch payload omitted; identical to the 00:07:12.425660 entry above] for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:12Z is after 2025-08-24T17:21:41Z"
Oct 03 00:07:12 crc kubenswrapper[4832]: I1003 00:07:12.442096 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 00:07:12 crc kubenswrapper[4832]: I1003 00:07:12.442132 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 00:07:12 crc kubenswrapper[4832]: I1003 00:07:12.442143 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 00:07:12 crc kubenswrapper[4832]: I1003 00:07:12.442157 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 00:07:12 crc kubenswrapper[4832]: I1003 00:07:12.442166 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:12Z","lastTransitionTime":"2025-10-03T00:07:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 00:07:12 crc kubenswrapper[4832]: E1003 00:07:12.454341 4832 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status [node status patch payload omitted; identical to the 00:07:12.425660 entry above] for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:12Z is after 2025-08-24T17:21:41Z"
Oct 03 00:07:12 crc kubenswrapper[4832]: I1003 00:07:12.457290 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 00:07:12 crc kubenswrapper[4832]: I1003 00:07:12.457323 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 00:07:12 crc kubenswrapper[4832]: I1003 00:07:12.457334 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 00:07:12 crc kubenswrapper[4832]: I1003 00:07:12.457349 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 00:07:12 crc kubenswrapper[4832]: I1003 00:07:12.457360 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:12Z","lastTransitionTime":"2025-10-03T00:07:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 00:07:12 crc kubenswrapper[4832]: E1003 00:07:12.470795 4832 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status [node status patch payload omitted; identical to the 00:07:12.425660 entry above] for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:12Z is after 
2025-08-24T17:21:41Z" Oct 03 00:07:12 crc kubenswrapper[4832]: I1003 00:07:12.474087 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:12 crc kubenswrapper[4832]: I1003 00:07:12.474134 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:12 crc kubenswrapper[4832]: I1003 00:07:12.474143 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:12 crc kubenswrapper[4832]: I1003 00:07:12.474159 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:12 crc kubenswrapper[4832]: I1003 00:07:12.474168 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:12Z","lastTransitionTime":"2025-10-03T00:07:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:12 crc kubenswrapper[4832]: E1003 00:07:12.485589 4832 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1e5f816a-2677-4a9b-8b73-ae876a2ba5f2\\\",\\\"systemUUID\\\":\\\"22c336a9-c598-416a-a85e-c54be0a6a3c5\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:12Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:12 crc kubenswrapper[4832]: E1003 00:07:12.485767 4832 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 03 00:07:12 crc kubenswrapper[4832]: I1003 00:07:12.487460 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 03 00:07:12 crc kubenswrapper[4832]: I1003 00:07:12.487494 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:12 crc kubenswrapper[4832]: I1003 00:07:12.487505 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:12 crc kubenswrapper[4832]: I1003 00:07:12.487519 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:12 crc kubenswrapper[4832]: I1003 00:07:12.487529 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:12Z","lastTransitionTime":"2025-10-03T00:07:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:12 crc kubenswrapper[4832]: I1003 00:07:12.541507 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" event={"ID":"13eeb165-19f7-4e3f-a442-1c9c570d853b","Type":"ContainerStarted","Data":"196d3468b2a4ea6792c19f85b28df1b5df5e612428d986d2affed70158ba92f7"} Oct 03 00:07:12 crc kubenswrapper[4832]: I1003 00:07:12.541553 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" event={"ID":"13eeb165-19f7-4e3f-a442-1c9c570d853b","Type":"ContainerStarted","Data":"786805325c320e5c08542c14a9ac9790f7c2dec237737fe1aca1523d1f2d26a5"} Oct 03 00:07:12 crc kubenswrapper[4832]: I1003 00:07:12.542959 4832 generic.go:334] "Generic (PLEG): container finished" podID="66a6f849-6870-4e28-8ca4-c849c22ad252" containerID="f49f638397d6dffb41b5774d362b7441d3191f290149de6ecef5c38a6e8a7a81" exitCode=0 Oct 03 00:07:12 crc kubenswrapper[4832]: I1003 00:07:12.543019 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zdfsz" event={"ID":"66a6f849-6870-4e28-8ca4-c849c22ad252","Type":"ContainerDied","Data":"f49f638397d6dffb41b5774d362b7441d3191f290149de6ecef5c38a6e8a7a81"} Oct 03 00:07:12 crc kubenswrapper[4832]: I1003 00:07:12.543040 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zdfsz" event={"ID":"66a6f849-6870-4e28-8ca4-c849c22ad252","Type":"ContainerStarted","Data":"2783b51b60c842b817182e2b24f2e0f2cc1da9487e208327360f45e81a45ab0f"} Oct 03 00:07:12 crc kubenswrapper[4832]: I1003 00:07:12.544238 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-b9c9g" event={"ID":"4369beec-ad3f-4de9-909b-7d0f8c76aad3","Type":"ContainerStarted","Data":"898c5e76b7f0bbad018b2aa9db1e27240ccbd9d5b66854f548aa6d664abc8799"} Oct 03 00:07:12 crc kubenswrapper[4832]: I1003 00:07:12.547926 4832 generic.go:334] "Generic (PLEG): container finished" podID="0f189ec1-8169-4886-bff6-f67f7958050e" containerID="0669e6b46f57beece4c83940bcffe580a4b4e2db5451a9c9ca4a9ad486fc95b6" exitCode=0 Oct 03 00:07:12 crc kubenswrapper[4832]: I1003 00:07:12.547982 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" event={"ID":"0f189ec1-8169-4886-bff6-f67f7958050e","Type":"ContainerDied","Data":"0669e6b46f57beece4c83940bcffe580a4b4e2db5451a9c9ca4a9ad486fc95b6"} Oct 03 00:07:12 crc kubenswrapper[4832]: 
I1003 00:07:12.548008 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" event={"ID":"0f189ec1-8169-4886-bff6-f67f7958050e","Type":"ContainerStarted","Data":"893a5dd7a6050b0f460f109807438c3564225c0c9254db0c028cc87736b5b8d7"} Oct 03 00:07:12 crc kubenswrapper[4832]: I1003 00:07:12.551050 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-rwqt7" event={"ID":"96bf1ee1-9249-4db9-83da-17eb15aa5f9a","Type":"ContainerStarted","Data":"778c9afa5747e424ff72688b9d90dd21e3faec1f409c2b9683b5a16a37685369"} Oct 03 00:07:12 crc kubenswrapper[4832]: I1003 00:07:12.551104 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-rwqt7" event={"ID":"96bf1ee1-9249-4db9-83da-17eb15aa5f9a","Type":"ContainerStarted","Data":"5bac065f410171736b77e10f8559dc6875aed7502f78df6fb18ad2e011f0311a"} Oct 03 00:07:12 crc kubenswrapper[4832]: I1003 00:07:12.559575 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fad7ffe6-f721-4c38-8acd-2d4de59b527b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da0287ba03b221e49a8840f93c53df8a865b9216a4e93f4a0fc4c62d79227004\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edaa07dca4d8aba0a6439a43af7397b5d9ef4d5a398f5c03ac93f3f41fb80ed9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef77e4c71ea4d3dd7aa61afb0787bf88352730ae638cfefe1f9f9f033208692a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0636ac12ebbda7e4ee8bbe4f5be5454ca7a879c92c8f444466a34c387668398c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af8925d64d9d95117625cfca58e2f0c90128ce63c82370e3a505d48fce14e474\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"message\\\":\\\"file observer\\\\nW1003 00:07:06.003008 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 00:07:06.003232 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 00:07:06.007835 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-713505854/tls.crt::/tmp/serving-cert-713505854/tls.key\\\\\\\"\\\\nI1003 00:07:06.247728 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 00:07:06.253351 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 00:07:06.253377 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 00:07:06.253665 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 00:07:06.253677 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 00:07:06.261087 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 00:07:06.261596 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 00:07:06.261602 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 00:07:06.261606 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 00:07:06.261610 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 00:07:06.261613 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 00:07:06.261616 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 00:07:06.261905 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 00:07:06.264368 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc6fca5c4906594dc721c77dba62a306d840e88ed949a9a579a695bdc1f1c011\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://359b3f91c772c18a52ed1e5985275405b990187a860cf7a7f62c2f602f54c77e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://359b3f91c772c18a52ed1e5985275405b990187a860cf7a7f62c2f602f54c77e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:06:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:12Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:12 crc kubenswrapper[4832]: I1003 00:07:12.579034 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a530701-412d-4b02-b532-de7993881992\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2b0553c8f774a096bc1446325436a1a73a24f67b3512462ce0b5029092f30c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0451961c80d1affd1cc6903d8c4ccba7de47c25255826d49ee312958f7a73b79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09e57b2ff533ebe739ef6e23dcd8d6f5ef85143046db77ed3a95ffb87f771152\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://703d4dc13279deda56e8154ab7238b507e78852fd477db2bbe6e3e19aa0fbd37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:06:47Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:12Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:12 crc kubenswrapper[4832]: I1003 00:07:12.591047 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:12 crc kubenswrapper[4832]: I1003 00:07:12.591093 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:12 crc kubenswrapper[4832]: I1003 00:07:12.591101 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:12 crc kubenswrapper[4832]: I1003 00:07:12.591115 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:12 crc kubenswrapper[4832]: I1003 00:07:12.591124 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:12Z","lastTransitionTime":"2025-10-03T00:07:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:12 crc kubenswrapper[4832]: I1003 00:07:12.596675 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:12Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:12 crc kubenswrapper[4832]: I1003 00:07:12.612294 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"13eeb165-19f7-4e3f-a442-1c9c570d853b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://196d3468b2a4ea6792c19f85b28df1b5df5e612428d986d2affed70158ba92f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfnk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://786805325c320e5c08542c14a9ac9790f7c2dec237737fe1aca1523d1f2d26a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfnk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xzpk4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:12Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:12 crc kubenswrapper[4832]: I1003 00:07:12.627473 4832 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:12Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:12 crc kubenswrapper[4832]: I1003 00:07:12.641320 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4175afbbdf7905f2c69f608e1cd85fdfae2818582d88e9a6d46c20ba0cede788\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:12Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:12 crc kubenswrapper[4832]: I1003 00:07:12.657060 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6753f7eb0f3ff7e4c94125513be77026242079d658b7139e86701a2b04ba5d1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://823db4c8596fa85759a3d818b0282b8b8b2c4c2e948160accbd306ef09e38060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:12Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:12 crc kubenswrapper[4832]: I1003 00:07:12.673032 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rwqt7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96bf1ee1-9249-4db9-83da-17eb15aa5f9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm92t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rwqt7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:12Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:12 crc kubenswrapper[4832]: I1003 00:07:12.693169 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31c3e0e9-c2a7-4c60-b523-9972df4adad5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1688f7425bdf5630147752e6d2c0dc58351b86d087c39bc6c4d73ffbf11949b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70f4161badca9a8087ab95b77ede5bc72d0c44b3f7f0223f2073354e76d6070d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5eb1fa15d84200378239e8a064cb83aa401750e92073f9d29d94858e65add776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\
\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://342b151ec083dc566428af857e4013b70d34fbf70632147d13f5e251ef3460c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f39a6c9881b2f1d620a85a3c6887d82b81d18ecf599c66b967d5580689537314\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a180babeb6469b2158474f74047864c74117a5c272be9437712bc1729563108b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a180babeb6469b2158474f74047864c74117a5c272be9437712bc1729563108b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b12fcbe54af7073178320f8a130e78196b2d1d3a055c4d8410e4cc1da2d4686e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\
\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b12fcbe54af7073178320f8a130e78196b2d1d3a055c4d8410e4cc1da2d4686e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f5e7853d0c8e96873c687c75d7226d8c11ed0ab5f4f2e1e975423de06b7c8c10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5e7853d0c8e96873c687c75d7226d8c11ed0ab5f4f2e1e975423de06b7c8c10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:06:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:12Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:12 crc kubenswrapper[4832]: I1003 00:07:12.694416 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:12 crc kubenswrapper[4832]: I1003 00:07:12.694468 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:12 crc kubenswrapper[4832]: I1003 00:07:12.694481 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:12 crc kubenswrapper[4832]: I1003 00:07:12.694500 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:12 crc kubenswrapper[4832]: I1003 00:07:12.694515 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:12Z","lastTransitionTime":"2025-10-03T00:07:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:12 crc kubenswrapper[4832]: I1003 00:07:12.710890 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zdfsz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66a6f849-6870-4e28-8ca4-c849c22ad252\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\
\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zdfsz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:12Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:12 crc kubenswrapper[4832]: I1003 00:07:12.738209 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:12Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:12 crc kubenswrapper[4832]: I1003 00:07:12.748045 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-b9c9g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4369beec-ad3f-4de9-909b-7d0f8c76aad3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvpm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-b9c9g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:12Z is 
after 2025-08-24T17:21:41Z" Oct 03 00:07:12 crc kubenswrapper[4832]: I1003 00:07:12.764275 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f189ec1-8169-4886-bff6-f67f7958050e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\
\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath
\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}
],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zc9xl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:12Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:12 crc kubenswrapper[4832]: I1003 00:07:12.779133 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78a6b04d90ea1fd4356f0a54594a83fd6029be216c6263654ad4916f2c3b7f5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:12Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:12 crc kubenswrapper[4832]: I1003 00:07:12.797174 
4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:12 crc kubenswrapper[4832]: I1003 00:07:12.797219 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:12 crc kubenswrapper[4832]: I1003 00:07:12.797231 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:12 crc kubenswrapper[4832]: I1003 00:07:12.797246 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:12 crc kubenswrapper[4832]: I1003 00:07:12.797256 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:12Z","lastTransitionTime":"2025-10-03T00:07:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:12 crc kubenswrapper[4832]: I1003 00:07:12.803013 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78a6b04d90ea1fd4356f0a54594a83fd6029be216c6263654ad4916f2c3b7f5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:12Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:12 crc kubenswrapper[4832]: I1003 00:07:12.814509 4832 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-dns/node-resolver-b9c9g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4369beec-ad3f-4de9-909b-7d0f8c76aad3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://898c5e76b7f0bbad018b2aa9db1e27240ccbd9d5b66854f548aa6d664abc8799\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvpm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-b9c9g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:12Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:12 crc kubenswrapper[4832]: I1003 00:07:12.829571 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f189ec1-8169-4886-bff6-f67f7958050e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging 
kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-li
b\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\
\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0669e6b46f57beece4c83940bcffe580a4b4e2db5451a9c9ca4a9ad486fc95b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0669e6b46f57beece4c83940bcffe580a4b4e2db5451a9c9ca4a9ad486fc95b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":
\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zc9xl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:12Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:12 crc kubenswrapper[4832]: I1003 00:07:12.841117 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fad7ffe6-f721-4c38-8acd-2d4de59b527b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da0287ba03b221e49a8840f93c53df8a865b9216a4e93f4a0fc4c62d79227004\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edaa07dca4d8aba0a6439a43af7397b5d9ef4d5a398f5c03ac93f3f41fb80ed9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef77e4c71ea4d3dd7aa61afb0787bf88352730ae638cfefe1f9f9f033208692a\
\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0636ac12ebbda7e4ee8bbe4f5be5454ca7a879c92c8f444466a34c387668398c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af8925d64d9d95117625cfca58e2f0c90128ce63c82370e3a505d48fce14e474\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"message\\\":\\\"file observer\\\\nW1003 00:07:06.003008 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 00:07:06.003232 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 00:07:06.007835 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-713505854/tls.crt::/tmp/serving-cert-713505854/tls.key\\\\\\\"\\\\nI1003 00:07:06.247728 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 00:07:06.253351 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 00:07:06.253377 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 00:07:06.253665 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 00:07:06.253677 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 00:07:06.261087 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 00:07:06.261596 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 00:07:06.261602 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 00:07:06.261606 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 00:07:06.261610 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 00:07:06.261613 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 00:07:06.261616 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 00:07:06.261905 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 00:07:06.264368 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc6fca5c4906594dc721c77dba62a306d840e88ed949a9a579a695bdc1f1c011\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://359b3f91c772c18a52ed1e5985275405b990187a860cf7a7f62c2f602f54c77e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://359b3f91c772c18a52ed1e5985275405b990187a860cf7a7f62c2f602f54c77e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:06:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:12Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:12 crc kubenswrapper[4832]: I1003 00:07:12.853004 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a530701-412d-4b02-b532-de7993881992\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2b0553c8f774a096bc1446325436a1a73a24f67b3512462ce0b5029092f30c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0451961c80d1affd1cc6903d8c4ccba7de47c25255826d49ee312958f7a73b79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09e57b2ff533ebe739ef6e23dcd8d6f5ef85143046db77ed3a95ffb87f771152\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://703d4dc13279deda56e8154ab7238b507e78852fd477db2bbe6e3e19aa0fbd37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:06:47Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:12Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:12 crc kubenswrapper[4832]: I1003 00:07:12.863714 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:12Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:12 crc kubenswrapper[4832]: I1003 00:07:12.899769 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:12 crc kubenswrapper[4832]: I1003 00:07:12.899817 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:12 crc kubenswrapper[4832]: I1003 00:07:12.899829 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:12 crc kubenswrapper[4832]: I1003 00:07:12.899844 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:12 crc kubenswrapper[4832]: I1003 00:07:12.899866 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:12Z","lastTransitionTime":"2025-10-03T00:07:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:12 crc kubenswrapper[4832]: I1003 00:07:12.922536 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13eeb165-19f7-4e3f-a442-1c9c570d853b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://196d3468b2a4ea6792c19f85b28df1b5df5e612428d986d2affed70158ba92f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfnk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://786805325c320e5c08542c14a9ac9790f7c2dec237737fe1aca1523d1f2d26a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfnk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xzpk4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:12Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:12 crc kubenswrapper[4832]: I1003 00:07:12.958267 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31c3e0e9-c2a7-4c60-b523-9972df4adad5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1688f7425bdf5630147752e6d2c0dc58351b86d087c39bc6c4d73ffbf11949b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70f4161badca9a8087ab95b77ede5bc72d0c44b3f7f0223f2073354e76d6070d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5eb1fa15d84200378239e8a064cb83aa401750e92073f9d29d94858e65add776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"re
startCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://342b151ec083dc566428af857e4013b70d34fbf70632147d13f5e251ef3460c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f39a6c9881b2f1d620a85a3c6887d82b81d18ecf599c66b967d5580689537314\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a180babeb6469b2158474f74047864c74117a5c272be9437712bc1729563108b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a180babeb6469b2158474f74047864c74117a5c272be9437712bc1729563108b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b12fcbe54af7073178320f8a130e78196b2d1d3a055c4d8410e4cc1da2d4686e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state
\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b12fcbe54af7073178320f8a130e78196b2d1d3a055c4d8410e4cc1da2d4686e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f5e7853d0c8e96873c687c75d7226d8c11ed0ab5f4f2e1e975423de06b7c8c10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5e7853d0c8e96873c687c75d7226d8c11ed0ab5f4f2e1e975423de06b7c8c10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:06:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:12Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:12 crc kubenswrapper[4832]: I1003 00:07:12.969733 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:12Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:12 crc kubenswrapper[4832]: I1003 00:07:12.982129 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4175afbbdf7905f2c69f608e1cd85fdfae2818582d88e9a6d46c20ba0cede788\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:12Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:12 crc kubenswrapper[4832]: I1003 00:07:12.994066 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6753f7eb0f3ff7e4c94125513be77026242079d658b7139e86701a2b04ba5d1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://823db4c8596fa85759a3d818b0282b8b8b2c4c2e948160accbd306ef09e38060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:12Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:13 crc kubenswrapper[4832]: I1003 00:07:13.002186 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:13 crc kubenswrapper[4832]: I1003 00:07:13.002227 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:13 crc kubenswrapper[4832]: I1003 00:07:13.002237 4832 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Oct 03 00:07:13 crc kubenswrapper[4832]: I1003 00:07:13.002252 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:13 crc kubenswrapper[4832]: I1003 00:07:13.002262 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:13Z","lastTransitionTime":"2025-10-03T00:07:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:13 crc kubenswrapper[4832]: I1003 00:07:13.007155 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rwqt7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96bf1ee1-9249-4db9-83da-17eb15aa5f9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://778c9afa5747e424ff72688b9d90dd21e3faec1f409c2b9683b5a16a37685369\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"
},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm92t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rwqt7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:13Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:13 crc kubenswrapper[4832]: I1003 00:07:13.019113 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:13Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:13 crc kubenswrapper[4832]: I1003 00:07:13.033242 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zdfsz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66a6f849-6870-4e28-8ca4-c849c22ad252\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f49f638397d6dffb41b5774d362b7441d3191f290149de6ecef5c38a6e8a7a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f49f638397d6dffb41b5774d362b7441d3191f290149de6ecef5c38a6e8a7a81\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reaso
n\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zdfsz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:13Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:13 crc 
kubenswrapper[4832]: I1003 00:07:13.105172 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:13 crc kubenswrapper[4832]: I1003 00:07:13.105214 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:13 crc kubenswrapper[4832]: I1003 00:07:13.105224 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:13 crc kubenswrapper[4832]: I1003 00:07:13.105238 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:13 crc kubenswrapper[4832]: I1003 00:07:13.105247 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:13Z","lastTransitionTime":"2025-10-03T00:07:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:13 crc kubenswrapper[4832]: I1003 00:07:13.207410 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:13 crc kubenswrapper[4832]: I1003 00:07:13.207450 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:13 crc kubenswrapper[4832]: I1003 00:07:13.207462 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:13 crc kubenswrapper[4832]: I1003 00:07:13.207477 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:13 crc kubenswrapper[4832]: I1003 00:07:13.207486 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:13Z","lastTransitionTime":"2025-10-03T00:07:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:13 crc kubenswrapper[4832]: I1003 00:07:13.311008 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:13 crc kubenswrapper[4832]: I1003 00:07:13.311045 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:13 crc kubenswrapper[4832]: I1003 00:07:13.311058 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:13 crc kubenswrapper[4832]: I1003 00:07:13.311072 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:13 crc kubenswrapper[4832]: I1003 00:07:13.311083 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:13Z","lastTransitionTime":"2025-10-03T00:07:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:13 crc kubenswrapper[4832]: I1003 00:07:13.385640 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 00:07:13 crc kubenswrapper[4832]: E1003 00:07:13.385798 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 00:07:13 crc kubenswrapper[4832]: I1003 00:07:13.385874 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 00:07:13 crc kubenswrapper[4832]: E1003 00:07:13.385920 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 00:07:13 crc kubenswrapper[4832]: I1003 00:07:13.386193 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 00:07:13 crc kubenswrapper[4832]: E1003 00:07:13.386247 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 00:07:13 crc kubenswrapper[4832]: I1003 00:07:13.392101 4832 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-76ck7"] Oct 03 00:07:13 crc kubenswrapper[4832]: I1003 00:07:13.397169 4832 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-76ck7" Oct 03 00:07:13 crc kubenswrapper[4832]: I1003 00:07:13.399028 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Oct 03 00:07:13 crc kubenswrapper[4832]: I1003 00:07:13.400262 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Oct 03 00:07:13 crc kubenswrapper[4832]: I1003 00:07:13.401148 4832 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Oct 03 00:07:13 crc kubenswrapper[4832]: I1003 00:07:13.401896 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Oct 03 00:07:13 crc kubenswrapper[4832]: I1003 00:07:13.413464 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:13 crc kubenswrapper[4832]: I1003 00:07:13.413505 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:13 crc kubenswrapper[4832]: I1003 00:07:13.413517 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:13 crc kubenswrapper[4832]: I1003 00:07:13.413534 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:13 crc kubenswrapper[4832]: I1003 00:07:13.413546 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:13Z","lastTransitionTime":"2025-10-03T00:07:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:13 crc kubenswrapper[4832]: I1003 00:07:13.418200 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31c3e0e9-c2a7-4c60-b523-9972df4adad5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1688f7425bdf5630147752e6d2c0dc58351b86d087c39bc6c4d73ffbf11949b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70f4161badca9a8087ab95b77ede5bc72d0c44b3f7f0223f2073354e76d6070d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5eb1fa15d84200378239e8a064cb83aa401750e92073f9d29d94858e65add776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://342b151ec083dc566428af857e4013b70d34fbf70632147d13f5e251ef3460c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f39a6c9881b2f1d620a85a3c6887d82b81d18ecf599c66b967d5580689537314\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a180babeb6469b2158474f74047864c74117a5c272be9437712bc1729563108b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a180babeb6469b2158474f74047864c74117a5c272be9437712bc1729563108b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b12fcbe54af7073178320f8a130e78196b2d1d3a055c4d8410e4cc1da2d4686e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b12fcbe54af7073178320f8a130e78196b2d1d3a055c4d8410e4cc1da2d4686e\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f5e7853d0c8e96873c687c75d7226d8c11ed0ab5f4f2e1e975423de06b7c8c10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5e7853d0c8e96873c687c75d7226d8c11ed0ab5f4f2e1e975423de06b7c8c10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:06:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:13Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:13 crc kubenswrapper[4832]: I1003 00:07:13.429898 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:13Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:13 crc kubenswrapper[4832]: I1003 00:07:13.446914 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4175afbbdf7905f2c69f608e1cd85fdfae2818582d88e9a6d46c20ba0cede788\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:13Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:13 crc kubenswrapper[4832]: I1003 00:07:13.458742 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6753f7eb0f3ff7e4c94125513be77026242079d658b7139e86701a2b04ba5d1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://823db4c8596fa85759a3d818b0282b8b8b2c4c2e948160accbd306ef09e38060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:13Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:13 crc kubenswrapper[4832]: I1003 00:07:13.472286 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rwqt7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96bf1ee1-9249-4db9-83da-17eb15aa5f9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://778c9afa5747e424ff72688b9d90dd21e3faec1f409c2b9683b5a16a37685369\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm92t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rwqt7\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:13Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:13 crc kubenswrapper[4832]: I1003 00:07:13.483276 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-76ck7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b962b7f-e442-4e25-97f4-21f16b9f1324\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:13Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:13Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ksm2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-76ck7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:13Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:13 crc kubenswrapper[4832]: I1003 00:07:13.494291 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/1b962b7f-e442-4e25-97f4-21f16b9f1324-serviceca\") pod \"node-ca-76ck7\" (UID: \"1b962b7f-e442-4e25-97f4-21f16b9f1324\") " pod="openshift-image-registry/node-ca-76ck7" Oct 03 00:07:13 crc kubenswrapper[4832]: I1003 00:07:13.494342 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/1b962b7f-e442-4e25-97f4-21f16b9f1324-host\") pod \"node-ca-76ck7\" (UID: \"1b962b7f-e442-4e25-97f4-21f16b9f1324\") " pod="openshift-image-registry/node-ca-76ck7" Oct 03 00:07:13 crc kubenswrapper[4832]: I1003 00:07:13.494386 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ksm2n\" (UniqueName: \"kubernetes.io/projected/1b962b7f-e442-4e25-97f4-21f16b9f1324-kube-api-access-ksm2n\") pod \"node-ca-76ck7\" (UID: \"1b962b7f-e442-4e25-97f4-21f16b9f1324\") " pod="openshift-image-registry/node-ca-76ck7" Oct 03 00:07:13 crc kubenswrapper[4832]: I1003 00:07:13.498551 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:13Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:13 crc kubenswrapper[4832]: I1003 00:07:13.514192 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zdfsz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66a6f849-6870-4e28-8ca4-c849c22ad252\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f49f638397d6dffb41b5774d362b7441d3191f290149de6ecef5c38a6e8a7a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f49f638397d6dffb41b5774d362b7441d3191f290149de6ecef5c38a6e8a7a81\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reaso
n\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zdfsz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:13Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:13 crc 
kubenswrapper[4832]: I1003 00:07:13.516616 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:13 crc kubenswrapper[4832]: I1003 00:07:13.516676 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:13 crc kubenswrapper[4832]: I1003 00:07:13.516688 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:13 crc kubenswrapper[4832]: I1003 00:07:13.516705 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:13 crc kubenswrapper[4832]: I1003 00:07:13.516728 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:13Z","lastTransitionTime":"2025-10-03T00:07:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:13 crc kubenswrapper[4832]: I1003 00:07:13.525836 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78a6b04d90ea1fd4356f0a54594a83fd6029be216c6263654ad4916f2c3b7f5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:13Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:13 crc kubenswrapper[4832]: I1003 00:07:13.540253 4832 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-dns/node-resolver-b9c9g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4369beec-ad3f-4de9-909b-7d0f8c76aad3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://898c5e76b7f0bbad018b2aa9db1e27240ccbd9d5b66854f548aa6d664abc8799\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvpm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-b9c9g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:13Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:13 crc kubenswrapper[4832]: I1003 00:07:13.556887 4832 generic.go:334] "Generic (PLEG): container finished" podID="66a6f849-6870-4e28-8ca4-c849c22ad252" containerID="d52a70ad1026d8eaaeed9dbb581eb90dbc962f036711e2f68085ac437c46b1fd" exitCode=0 Oct 03 00:07:13 crc kubenswrapper[4832]: I1003 00:07:13.556960 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f189ec1-8169-4886-bff6-f67f7958050e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0669e6b46f57beece4c83940bcffe580a4b4e2db5451a9c9ca4a9ad486fc95b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36
cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0669e6b46f57beece4c83940bcffe580a4b4e2db5451a9c9ca4a9ad486fc95b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zc9xl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:13Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:13 crc kubenswrapper[4832]: I1003 00:07:13.557068 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zdfsz" event={"ID":"66a6f849-6870-4e28-8ca4-c849c22ad252","Type":"ContainerDied","Data":"d52a70ad1026d8eaaeed9dbb581eb90dbc962f036711e2f68085ac437c46b1fd"} Oct 03 00:07:13 crc kubenswrapper[4832]: I1003 00:07:13.561203 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" event={"ID":"0f189ec1-8169-4886-bff6-f67f7958050e","Type":"ContainerStarted","Data":"2ba40dbb3c6b603f4c75b3f017b0b8953a189aac6dba4cc0e1fa5174b6353f3f"} Oct 03 00:07:13 crc kubenswrapper[4832]: I1003 00:07:13.561224 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" event={"ID":"0f189ec1-8169-4886-bff6-f67f7958050e","Type":"ContainerStarted","Data":"e6ee507d1127a1ca82e96f57a031cfef9adbd0b310588ce49a32f0cbcab7b3f4"} Oct 03 00:07:13 crc kubenswrapper[4832]: I1003 00:07:13.561234 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" event={"ID":"0f189ec1-8169-4886-bff6-f67f7958050e","Type":"ContainerStarted","Data":"1e493b0aab45e7a7842b48f0a13a6850b3bcd8a8f2147880ff278b92b04e9535"} Oct 03 00:07:13 crc kubenswrapper[4832]: I1003 00:07:13.572870 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fad7ffe6-f721-4c38-8acd-2d4de59b527b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da0287ba03b221e49a8840f93c53df8a865b9216a4e93f4a0fc4c62d79227004\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edaa07dca4d8aba0a6439a43af7397b5d9ef4d5a398f5c03ac93f3f41fb80ed9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef77e4c71ea4d3dd7aa61afb0787bf88352730ae638cfefe1f9f9f033208692a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0636ac12ebbda7e4ee8bbe4f5be5454ca7a879c92c8f444466a34c387668398c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\
\":\\\"cri-o://af8925d64d9d95117625cfca58e2f0c90128ce63c82370e3a505d48fce14e474\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"message\\\":\\\"file observer\\\\nW1003 00:07:06.003008 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 00:07:06.003232 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 00:07:06.007835 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-713505854/tls.crt::/tmp/serving-cert-713505854/tls.key\\\\\\\"\\\\nI1003 00:07:06.247728 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 00:07:06.253351 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 00:07:06.253377 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 00:07:06.253665 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 00:07:06.253677 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 00:07:06.261087 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 00:07:06.261596 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 00:07:06.261602 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 00:07:06.261606 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 00:07:06.261610 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 00:07:06.261613 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 00:07:06.261616 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 00:07:06.261905 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 00:07:06.264368 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc6fca5c4906594dc721c77dba62a306d840e88ed949a9a579a695bdc1f1c011\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://359b3f91c772c18a52ed1e5985275405b990187a860cf7a7f62c2f602f54c77e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://359b3f91c772c18a52ed1e5985275405b990187a860cf7a7f62c2f602f54c77e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:06:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:13Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:13 crc kubenswrapper[4832]: I1003 00:07:13.589348 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a530701-412d-4b02-b532-de7993881992\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2b0553c8f774a096bc1446325436a1a73a24f67b3512462ce0b5029092f30c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0451961c80d1affd1cc6903d8c4ccba7de47c25255826d49ee312958f7a73b79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09e57b2ff533ebe739ef6e23dcd8d6f5ef85143046db77ed3a95ffb87f771152\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://703d4dc13279deda56e8154ab7238b507e78852fd477db2bbe6e3e19aa0fbd37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:06:47Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:13Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:13 crc kubenswrapper[4832]: I1003 00:07:13.595073 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/1b962b7f-e442-4e25-97f4-21f16b9f1324-serviceca\") pod \"node-ca-76ck7\" (UID: \"1b962b7f-e442-4e25-97f4-21f16b9f1324\") " pod="openshift-image-registry/node-ca-76ck7" Oct 03 00:07:13 crc kubenswrapper[4832]: I1003 00:07:13.595106 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1b962b7f-e442-4e25-97f4-21f16b9f1324-host\") pod \"node-ca-76ck7\" (UID: \"1b962b7f-e442-4e25-97f4-21f16b9f1324\") " pod="openshift-image-registry/node-ca-76ck7" Oct 03 00:07:13 crc kubenswrapper[4832]: I1003 00:07:13.595125 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ksm2n\" (UniqueName: \"kubernetes.io/projected/1b962b7f-e442-4e25-97f4-21f16b9f1324-kube-api-access-ksm2n\") pod \"node-ca-76ck7\" (UID: \"1b962b7f-e442-4e25-97f4-21f16b9f1324\") " pod="openshift-image-registry/node-ca-76ck7" Oct 03 00:07:13 crc kubenswrapper[4832]: I1003 00:07:13.595269 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1b962b7f-e442-4e25-97f4-21f16b9f1324-host\") pod \"node-ca-76ck7\" (UID: \"1b962b7f-e442-4e25-97f4-21f16b9f1324\") " pod="openshift-image-registry/node-ca-76ck7" Oct 03 00:07:13 crc kubenswrapper[4832]: I1003 00:07:13.596244 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/1b962b7f-e442-4e25-97f4-21f16b9f1324-serviceca\") pod \"node-ca-76ck7\" (UID: \"1b962b7f-e442-4e25-97f4-21f16b9f1324\") " pod="openshift-image-registry/node-ca-76ck7" Oct 03 00:07:13 crc kubenswrapper[4832]: I1003 00:07:13.609122 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:13Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:13 crc kubenswrapper[4832]: I1003 00:07:13.619198 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:13 crc kubenswrapper[4832]: I1003 00:07:13.619232 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:13 crc kubenswrapper[4832]: I1003 00:07:13.619243 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:13 crc kubenswrapper[4832]: I1003 00:07:13.619255 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:13 crc kubenswrapper[4832]: I1003 00:07:13.619263 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:13Z","lastTransitionTime":"2025-10-03T00:07:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:13 crc kubenswrapper[4832]: I1003 00:07:13.630395 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ksm2n\" (UniqueName: \"kubernetes.io/projected/1b962b7f-e442-4e25-97f4-21f16b9f1324-kube-api-access-ksm2n\") pod \"node-ca-76ck7\" (UID: \"1b962b7f-e442-4e25-97f4-21f16b9f1324\") " pod="openshift-image-registry/node-ca-76ck7" Oct 03 00:07:13 crc kubenswrapper[4832]: I1003 00:07:13.631566 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13eeb165-19f7-4e3f-a442-1c9c570d853b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://196d3468b2a4ea6792c19f85b28df1b5df5e612428d986d2affed70158ba92f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfnk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://786805325c320e5c08542c14a9ac9790f7c2dec237737fe1aca1523d1f2d26a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfnk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\
"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xzpk4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:13Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:13 crc kubenswrapper[4832]: I1003 00:07:13.642705 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-76ck7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b962b7f-e442-4e25-97f4-21f16b9f1324\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:13Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:13Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ksm2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-76ck7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:13Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:13 crc kubenswrapper[4832]: I1003 00:07:13.664677 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"31c3e0e9-c2a7-4c60-b523-9972df4adad5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1688f7425bdf5630147752e6d2c0dc58351b86d087c39bc6c4d73ffbf11949b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70f4161badca9a8087ab95b77ede5bc72d0c44b3f7f0223f2073354e76d6070d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5eb1fa15d84200378239e8a064cb83aa401750e92073f9d29d94858e65add776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://342b151ec083dc566428af857e4013b70d34fbf
70632147d13f5e251ef3460c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f39a6c9881b2f1d620a85a3c6887d82b81d18ecf599c66b967d5580689537314\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a180babeb6469b2158474f74047864c74117a5c272be9437712bc1729563108b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a180babeb6469b2158474f74047864c74117a5c272be9437712bc1729563108b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b12fcbe54af7073178320f8a130e78196b2d1d3a055c4d8410e4cc1da2d4686e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b12fcbe54af7073178320f8a130e78196b2d1d3a055c4d8410e4cc1da2d4686e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f5e7853d0c8e96873c687c75d7226d8c11ed0ab5f4f2e1e975423de06b7c8c10\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5e7853d0c8e96873c687c75d7226d8c11ed0ab5f4f2e1e975423de06b7c8c10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:06:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:13Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:13 crc kubenswrapper[4832]: I1003 00:07:13.686833 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:13Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:13 crc kubenswrapper[4832]: I1003 00:07:13.699685 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4175afbbdf7905f2c69f608e1cd85fdfae2818582d88e9a6d46c20ba0cede788\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:13Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:13 crc kubenswrapper[4832]: I1003 00:07:13.714259 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6753f7eb0f3ff7e4c94125513be77026242079d658b7139e86701a2b04ba5d1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://823db4c8596fa85759a3d818b0282b8b8b2c4c2e948160accbd306ef09e38060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:13Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:13 crc kubenswrapper[4832]: I1003 00:07:13.721625 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:13 crc kubenswrapper[4832]: I1003 00:07:13.721654 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:13 crc kubenswrapper[4832]: I1003 00:07:13.721662 4832 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Oct 03 00:07:13 crc kubenswrapper[4832]: I1003 00:07:13.721676 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:13 crc kubenswrapper[4832]: I1003 00:07:13.721684 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:13Z","lastTransitionTime":"2025-10-03T00:07:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:13 crc kubenswrapper[4832]: I1003 00:07:13.729357 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rwqt7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96bf1ee1-9249-4db9-83da-17eb15aa5f9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://778c9afa5747e424ff72688b9d90dd21e3faec1f409c2b9683b5a16a37685369\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"
},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm92t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rwqt7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:13Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:13 crc kubenswrapper[4832]: I1003 00:07:13.739148 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-76ck7" Oct 03 00:07:13 crc kubenswrapper[4832]: I1003 00:07:13.743258 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:13Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:13 crc kubenswrapper[4832]: W1003 00:07:13.752547 4832 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b962b7f_e442_4e25_97f4_21f16b9f1324.slice/crio-2def7036513ac34e4695ed9ecdfbd94b4d4b18b9cddc8846ac397a602909f368 WatchSource:0}: Error finding container 2def7036513ac34e4695ed9ecdfbd94b4d4b18b9cddc8846ac397a602909f368: Status 404 returned error can't find the container with id 2def7036513ac34e4695ed9ecdfbd94b4d4b18b9cddc8846ac397a602909f368 Oct 03 00:07:13 crc kubenswrapper[4832]: I1003 00:07:13.769255 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zdfsz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66a6f849-6870-4e28-8ca4-c849c22ad252\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f49f638397d6dffb41b5774d362b7441d3191f290149de6ecef5c38a6e8a7a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f49f638397d6dffb41b5774d362b7441d3191f290149de6ecef5c38a6e8a7a81\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d52a70ad1026d8eaaeed9dbb581eb90dbc962f036711e2f68085ac437c46b1fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d52a70ad1026d8eaaeed9dbb581eb90dbc962f036711e2f68085ac437c46b1fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-
03T00:07:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zdfsz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:13Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:13 crc kubenswrapper[4832]: I1003 00:07:13.796775 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 00:07:13 crc kubenswrapper[4832]: I1003 00:07:13.796891 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 00:07:13 crc kubenswrapper[4832]: I1003 00:07:13.796917 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 00:07:13 crc kubenswrapper[4832]: I1003 00:07:13.796945 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 00:07:13 crc kubenswrapper[4832]: I1003 00:07:13.796988 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 00:07:13 crc kubenswrapper[4832]: E1003 00:07:13.797077 4832 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 00:07:13 crc kubenswrapper[4832]: E1003 00:07:13.797122 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 00:07:21.797109943 +0000 UTC m=+35.059265049 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 00:07:13 crc kubenswrapper[4832]: E1003 00:07:13.797306 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 00:07:21.797297569 +0000 UTC m=+35.059452675 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 00:07:13 crc kubenswrapper[4832]: E1003 00:07:13.797362 4832 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 00:07:13 crc kubenswrapper[4832]: E1003 00:07:13.797372 4832 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 00:07:13 crc kubenswrapper[4832]: E1003 00:07:13.797381 4832 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 00:07:13 crc kubenswrapper[4832]: E1003 00:07:13.797404 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-03 00:07:21.797397762 +0000 UTC m=+35.059552858 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 00:07:13 crc kubenswrapper[4832]: E1003 00:07:13.797443 4832 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 00:07:13 crc kubenswrapper[4832]: E1003 00:07:13.797452 4832 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 00:07:13 crc kubenswrapper[4832]: E1003 00:07:13.797458 4832 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 00:07:13 crc kubenswrapper[4832]: E1003 00:07:13.797475 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-03 00:07:21.797469625 +0000 UTC m=+35.059624731 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 00:07:13 crc kubenswrapper[4832]: E1003 00:07:13.797500 4832 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 03 00:07:13 crc kubenswrapper[4832]: E1003 00:07:13.797538 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 00:07:21.797530777 +0000 UTC m=+35.059685883 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 03 00:07:13 crc kubenswrapper[4832]: I1003 00:07:13.801032 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78a6b04d90ea1fd4356f0a54594a83fd6029be216c6263654ad4916f2c3b7f5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:13Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:13 crc kubenswrapper[4832]: I1003 00:07:13.824591 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:13 crc kubenswrapper[4832]: I1003 00:07:13.824630 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:13 crc kubenswrapper[4832]: I1003 00:07:13.824640 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:13 crc kubenswrapper[4832]: I1003 00:07:13.824654 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:13 crc kubenswrapper[4832]: I1003 00:07:13.824665 4832 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:13Z","lastTransitionTime":"2025-10-03T00:07:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:13 crc kubenswrapper[4832]: I1003 00:07:13.841628 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-b9c9g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4369beec-ad3f-4de9-909b-7d0f8c76aad3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://898c5e76b7f0bbad018b2aa9db1e27240ccbd9d5b66854f548aa6d664abc8799\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvpm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-b9c9g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:13Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:13 crc kubenswrapper[4832]: I1003 00:07:13.886055 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f189ec1-8169-4886-bff6-f67f7958050e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0669e6b46f57beece4c83940bcffe580a4b4e2db5451a9c9ca4a9ad486fc95b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36
cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0669e6b46f57beece4c83940bcffe580a4b4e2db5451a9c9ca4a9ad486fc95b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zc9xl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:13Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:13 crc kubenswrapper[4832]: I1003 00:07:13.920311 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13eeb165-19f7-4e3f-a442-1c9c570d853b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://196d3468b2a4ea6792c19f85b28df1b5df5e612428d986d2affed70158ba92f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfnk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://786805325c320e5c08542c14a9ac9790f7c2dec237737fe1aca1523d1f2d26a5\\\",\\\"image\\\":\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfnk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xzpk4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:13Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:13 crc kubenswrapper[4832]: I1003 00:07:13.926957 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:13 crc kubenswrapper[4832]: I1003 00:07:13.927004 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:13 crc kubenswrapper[4832]: I1003 00:07:13.927014 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:13 crc kubenswrapper[4832]: I1003 00:07:13.927038 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:13 crc kubenswrapper[4832]: I1003 00:07:13.927047 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:13Z","lastTransitionTime":"2025-10-03T00:07:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:13 crc kubenswrapper[4832]: I1003 00:07:13.962433 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fad7ffe6-f721-4c38-8acd-2d4de59b527b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da0287ba03b221e49a8840f93c53df8a865b9216a4e93f4a0fc4c62d79227004\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edaa07dca4d8aba0a6439a43af7397b5d9ef4d5a398f5c03ac93f3f41fb80ed9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef77e4c71ea4d3dd7aa61afb0787bf88352730ae638cfefe1f9f9f033208692a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0636ac12ebbda7e4ee8bbe4f5be5454ca7a879c92c8f444466a34c387668398c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af8925d64d9d95117625cfca58e2f0c90128ce63c82370e3a505d48fce14e474\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"message\\\":\\\"file observer\\\\nW1003 00:07:06.003008 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 00:07:06.003232 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 00:07:06.007835 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-713505854/tls.crt::/tmp/serving-cert-713505854/tls.key\\\\\\\"\\\\nI1003 00:07:06.247728 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 00:07:06.253351 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 00:07:06.253377 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 00:07:06.253665 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 00:07:06.253677 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 00:07:06.261087 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 00:07:06.261596 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 00:07:06.261602 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 00:07:06.261606 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 00:07:06.261610 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 00:07:06.261613 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 00:07:06.261616 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 00:07:06.261905 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 00:07:06.264368 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc6fca5c4906594dc721c77dba62a306d840e88ed949a9a579a695bdc1f1c011\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://359b3f91c772c18a52ed1e5985275405b990187a860cf7a7f62c2f602f54c77e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://359b3f91c772c18a52ed1e5985275405b990187a860cf7a7f62c2f602f54c77e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:06:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:13Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:14 crc kubenswrapper[4832]: I1003 00:07:14.000139 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a530701-412d-4b02-b532-de7993881992\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2b0553c8f774a096bc1446325436a1a73a24f67b3512462ce0b5029092f30c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0451961c80d1affd1cc6903d8c4ccba7de47c25255826d49ee312958f7a73b79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09e57b2ff533ebe739ef6e23dcd8d6f5ef85143046db77ed3a95ffb87f771152\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://703d4dc13279deda56e8154ab7238b507e78852fd477db2bbe6e3e19aa0fbd37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:06:47Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:13Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:14 crc kubenswrapper[4832]: I1003 00:07:14.029292 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:14 crc kubenswrapper[4832]: I1003 00:07:14.029345 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:14 crc kubenswrapper[4832]: I1003 00:07:14.029356 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:14 crc kubenswrapper[4832]: I1003 00:07:14.029371 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:14 crc kubenswrapper[4832]: I1003 00:07:14.029380 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:14Z","lastTransitionTime":"2025-10-03T00:07:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:14 crc kubenswrapper[4832]: I1003 00:07:14.039687 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:14Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:14 crc kubenswrapper[4832]: I1003 00:07:14.131748 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:14 crc kubenswrapper[4832]: I1003 00:07:14.131817 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:14 crc kubenswrapper[4832]: I1003 00:07:14.131828 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:14 crc kubenswrapper[4832]: I1003 00:07:14.131846 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:14 crc kubenswrapper[4832]: I1003 00:07:14.131856 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:14Z","lastTransitionTime":"2025-10-03T00:07:14Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:14 crc kubenswrapper[4832]: I1003 00:07:14.234688 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:14 crc kubenswrapper[4832]: I1003 00:07:14.234723 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:14 crc kubenswrapper[4832]: I1003 00:07:14.234732 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:14 crc kubenswrapper[4832]: I1003 00:07:14.234747 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:14 crc kubenswrapper[4832]: I1003 00:07:14.234776 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:14Z","lastTransitionTime":"2025-10-03T00:07:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:14 crc kubenswrapper[4832]: I1003 00:07:14.337150 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:14 crc kubenswrapper[4832]: I1003 00:07:14.337193 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:14 crc kubenswrapper[4832]: I1003 00:07:14.337204 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:14 crc kubenswrapper[4832]: I1003 00:07:14.337219 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:14 crc kubenswrapper[4832]: I1003 00:07:14.337230 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:14Z","lastTransitionTime":"2025-10-03T00:07:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:14 crc kubenswrapper[4832]: I1003 00:07:14.440422 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:14 crc kubenswrapper[4832]: I1003 00:07:14.440495 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:14 crc kubenswrapper[4832]: I1003 00:07:14.440513 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:14 crc kubenswrapper[4832]: I1003 00:07:14.440546 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:14 crc kubenswrapper[4832]: I1003 00:07:14.440564 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:14Z","lastTransitionTime":"2025-10-03T00:07:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:14 crc kubenswrapper[4832]: I1003 00:07:14.542588 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:14 crc kubenswrapper[4832]: I1003 00:07:14.542616 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:14 crc kubenswrapper[4832]: I1003 00:07:14.542626 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:14 crc kubenswrapper[4832]: I1003 00:07:14.542637 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:14 crc kubenswrapper[4832]: I1003 00:07:14.542657 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:14Z","lastTransitionTime":"2025-10-03T00:07:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:14 crc kubenswrapper[4832]: I1003 00:07:14.568641 4832 generic.go:334] "Generic (PLEG): container finished" podID="66a6f849-6870-4e28-8ca4-c849c22ad252" containerID="7ddd5a446bcc39d02bbaf8ae1b9a92b550b975c4280cea13a517076261e177bb" exitCode=0 Oct 03 00:07:14 crc kubenswrapper[4832]: I1003 00:07:14.568689 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zdfsz" event={"ID":"66a6f849-6870-4e28-8ca4-c849c22ad252","Type":"ContainerDied","Data":"7ddd5a446bcc39d02bbaf8ae1b9a92b550b975c4280cea13a517076261e177bb"} Oct 03 00:07:14 crc kubenswrapper[4832]: I1003 00:07:14.578048 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-76ck7" event={"ID":"1b962b7f-e442-4e25-97f4-21f16b9f1324","Type":"ContainerStarted","Data":"0e5df7a035e3c381c541f08eec6fffe846e7002a534b4c6ae1316a15cf74e469"} Oct 03 00:07:14 crc kubenswrapper[4832]: I1003 00:07:14.578133 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-76ck7" event={"ID":"1b962b7f-e442-4e25-97f4-21f16b9f1324","Type":"ContainerStarted","Data":"2def7036513ac34e4695ed9ecdfbd94b4d4b18b9cddc8846ac397a602909f368"} Oct 03 00:07:14 crc kubenswrapper[4832]: I1003 00:07:14.584472 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" event={"ID":"0f189ec1-8169-4886-bff6-f67f7958050e","Type":"ContainerStarted","Data":"a3b296d00954bf341fee4ad258c6ee44adbd2950c89e73aaf5141201b61c5332"} Oct 03 00:07:14 crc kubenswrapper[4832]: I1003 00:07:14.584519 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" event={"ID":"0f189ec1-8169-4886-bff6-f67f7958050e","Type":"ContainerStarted","Data":"7712a571bceb2c88adb90f9f18cdef6f7666e96341c1f9c87e6a55c82b59efd3"} Oct 03 00:07:14 crc kubenswrapper[4832]: I1003 00:07:14.584537 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" event={"ID":"0f189ec1-8169-4886-bff6-f67f7958050e","Type":"ContainerStarted","Data":"f2c63d70335d75dec7e19ae160dfa2a00f834d9aa3725bb4b63b94aa488fca0f"} Oct 03 00:07:14 crc kubenswrapper[4832]: I1003 00:07:14.591274 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78a6b04d90ea1fd4356f0a54594a83fd6029be216c6263654ad4916f2c3b7f5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:14Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:14 crc kubenswrapper[4832]: I1003 00:07:14.608673 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-b9c9g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4369beec-ad3f-4de9-909b-7d0f8c76aad3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://898c5e76b7f0bbad018b2aa9db1e27240ccbd9d5b66854f548aa6d664abc8799\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvpm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-b9c9g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:14Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:14 crc kubenswrapper[4832]: I1003 00:07:14.638043 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f189ec1-8169-4886-bff6-f67f7958050e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":
\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0669e6b46f57beece4c83940bcffe580a4b4e2db5451a9c9ca4a9ad486fc95b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0669e6b46f57beece4c83940bcffe580a4b4e2db5451a9c9ca4a9ad486fc95b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11
Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zc9xl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:14Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:14 crc kubenswrapper[4832]: I1003 00:07:14.647952 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:14 crc kubenswrapper[4832]: I1003 00:07:14.648002 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:14 crc kubenswrapper[4832]: I1003 00:07:14.648033 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:14 crc kubenswrapper[4832]: I1003 00:07:14.648048 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:14 crc kubenswrapper[4832]: I1003 00:07:14.648057 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:14Z","lastTransitionTime":"2025-10-03T00:07:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:14 crc kubenswrapper[4832]: I1003 00:07:14.652783 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fad7ffe6-f721-4c38-8acd-2d4de59b527b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da0287ba03b221e49a8840f93c53df8a865b9216a4e93f4a0fc4c62d79227004\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edaa07dca4d8aba0a6439a43af7397b5d9ef4d5a398f5c03ac93f3f41fb80ed9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef77e4c71ea4d3dd7aa61afb0787bf88352730ae638cfefe1f9f9f033208692a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0636ac12ebbda7e4ee8bbe4f5be5454ca7a879c92c8f444466a34c387668398c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af8925d64d9d95117625cfca58e2f0c90128ce63c82370e3a505d48fce14e474\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"message\\\":\\\"file observer\\\\nW1003 00:07:06.003008 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 00:07:06.003232 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 00:07:06.007835 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-713505854/tls.crt::/tmp/serving-cert-713505854/tls.key\\\\\\\"\\\\nI1003 00:07:06.247728 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 00:07:06.253351 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 00:07:06.253377 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 00:07:06.253665 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 00:07:06.253677 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 00:07:06.261087 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 00:07:06.261596 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 00:07:06.261602 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 00:07:06.261606 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 00:07:06.261610 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 00:07:06.261613 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 00:07:06.261616 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 00:07:06.261905 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 00:07:06.264368 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc6fca5c4906594dc721c77dba62a306d840e88ed949a9a579a695bdc1f1c011\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://359b3f91c772c18a52ed1e5985275405b990187a860cf7a7f62c2f602f54c77e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://359b3f91c772c18a52ed1e5985275405b990187a860cf7a7f62c2f602f54c77e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:06:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:14Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:14 crc kubenswrapper[4832]: I1003 00:07:14.665568 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a530701-412d-4b02-b532-de7993881992\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2b0553c8f774a096bc1446325436a1a73a24f67b3512462ce0b5029092f30c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0451961c80d1affd1cc6903d8c4ccba7de47c25255826d49ee312958f7a73b79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09e57b2ff533ebe739ef6e23dcd8d6f5ef85143046db77ed3a95ffb87f771152\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://703d4dc13279deda56e8154ab7238b507e78852fd477db2bbe6e3e19aa0fbd37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:06:47Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:14Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:14 crc kubenswrapper[4832]: I1003 00:07:14.677952 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:14Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:14 crc kubenswrapper[4832]: I1003 00:07:14.688867 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13eeb165-19f7-4e3f-a442-1c9c570d853b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://196d3468b2a4ea6792c19f85b28df1b5df5e612428d986d2affed70158ba92f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfnk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://786805325c320e5c08542c14a9ac9790f7c2dec237737fe1aca1523d1f2d26a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfnk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xzpk4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:14Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:14 crc kubenswrapper[4832]: I1003 00:07:14.705595 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31c3e0e9-c2a7-4c60-b523-9972df4adad5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1688f7425bdf5630147752e6d2c0dc58351b86d087c39bc6c4d73ffbf11949b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70f4161badca9a8087ab95b77ede5bc72d0c44b3f7f0223f2073354e76d6070d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be
8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5eb1fa15d84200378239e8a064cb83aa401750e92073f9d29d94858e65add776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://342b151ec083dc566428af857e4013b70d34fbf70632147d13f5e251ef3460c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f39a6c9881b2f1d620a85a3c6887d82b81d18ecf599c66b967d5580689537314\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a180babeb6469b2158474f74047864c74117a5c272be9437712bc1729563108b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a180babeb6469b2158474f74047864c74117a5c272be9437712bc1729563108b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b12fcbe54af7073178320f8a130e78196b2d1d3a055c4d8410e4cc1da2d4686e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b12fcbe54af7073178320f8a130e78196b2d1d3a055c4d8410e4cc1da2d4686e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f5e7853d0c8e96873c687c75d7226d8c11ed0ab5f4f2e1e975423de06b7c8c10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5e7853d0c8e96873c687c75d7226d8c11ed0ab5f4f2e1e975423de06b7c8c10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:06:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:14Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:14 crc kubenswrapper[4832]: I1003 00:07:14.717482 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:14Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:14 crc kubenswrapper[4832]: I1003 00:07:14.730246 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4175afbbdf7905f2c69f608e1cd85fdfae2818582d88e9a6d46c20ba0cede788\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:14Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:14 crc kubenswrapper[4832]: I1003 00:07:14.744435 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6753f7eb0f3ff7e4c94125513be77026242079d658b7139e86701a2b04ba5d1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://823db4c8596fa85759a3d818b0282b8b8b2c4c2e948160accbd306ef09e38060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:14Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:14 crc kubenswrapper[4832]: I1003 00:07:14.751195 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:14 crc kubenswrapper[4832]: I1003 00:07:14.751227 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:14 crc kubenswrapper[4832]: I1003 00:07:14.751237 4832 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Oct 03 00:07:14 crc kubenswrapper[4832]: I1003 00:07:14.751251 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:14 crc kubenswrapper[4832]: I1003 00:07:14.751262 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:14Z","lastTransitionTime":"2025-10-03T00:07:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:14 crc kubenswrapper[4832]: I1003 00:07:14.760484 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rwqt7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96bf1ee1-9249-4db9-83da-17eb15aa5f9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://778c9afa5747e424ff72688b9d90dd21e3faec1f409c2b9683b5a16a37685369\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"
},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm92t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rwqt7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:14Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:14 crc kubenswrapper[4832]: I1003 00:07:14.769860 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-76ck7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b962b7f-e442-4e25-97f4-21f16b9f1324\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:13Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:13Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ksm2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-76ck7\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:14Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:14 crc kubenswrapper[4832]: I1003 00:07:14.783338 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:14Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:14 crc kubenswrapper[4832]: I1003 00:07:14.796388 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zdfsz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66a6f849-6870-4e28-8ca4-c849c22ad252\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f49f638397d6dffb41b5774d362b7441d3191f290149de6ecef5c38a6e8a7a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f49f638397d6dffb41b5774d362b7441d3191f290149de6ecef5c38a6e8a7a81\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d52a70ad1026d8eaaeed9dbb581eb90dbc962f036711e2f68085ac437c46b1fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d52a70ad1026d8eaaeed9dbb581eb90dbc962f036711e2f68085ac437c46b1fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03
T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ddd5a446bcc39d02bbaf8ae1b9a92b550b975c4280cea13a517076261e177bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ddd5a446bcc39d02bbaf8ae1b9a92b550b975c4280cea13a517076261e177bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\
\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zdfsz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:14Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:14 crc kubenswrapper[4832]: I1003 00:07:14.806548 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:14Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:14 crc kubenswrapper[4832]: I1003 00:07:14.819186 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zdfsz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66a6f849-6870-4e28-8ca4-c849c22ad252\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f49f638397d6dffb41b5774d362b7441d3191f290149de6ecef5c38a6e8a7a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f49f638397d6dffb41b5774d362b7441d3191f290149de6ecef5c38a6e8a7a81\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d52a70ad1026d8eaaeed9dbb581eb90dbc962f036711e2f68085ac437c46b1fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d52a70ad1026d8eaaeed9dbb581eb90dbc962f036711e2f68085ac437c46b1fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ddd5a446bcc39d02bbaf8ae1b9a92b550b975c4280cea13a517076261e177bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ddd5a446bcc39d02bbaf8ae1b9a92b550b975c4280cea13a517076261e177bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/
cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zdfsz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:14Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:14 crc kubenswrapper[4832]: I1003 00:07:14.832785 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78a6b04d90ea1fd4356f0a54594a83fd6029be216c6263654ad4916f2c3b7f5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:14Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:14 crc kubenswrapper[4832]: I1003 00:07:14.843629 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-b9c9g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4369beec-ad3f-4de9-909b-7d0f8c76aad3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://898c5e76b7f0bbad018b2aa9db1e27240ccbd9d5b66854f548aa6d664abc8799\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvpm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-b9c9g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:14Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:14 crc kubenswrapper[4832]: I1003 00:07:14.853532 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:14 crc kubenswrapper[4832]: I1003 00:07:14.853566 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:14 crc kubenswrapper[4832]: I1003 00:07:14.853575 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:14 crc kubenswrapper[4832]: I1003 00:07:14.853590 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:14 crc kubenswrapper[4832]: I1003 00:07:14.853600 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:14Z","lastTransitionTime":"2025-10-03T00:07:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:14 crc kubenswrapper[4832]: I1003 00:07:14.861254 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f189ec1-8169-4886-bff6-f67f7958050e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"
ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",
\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"c
ri-o://0669e6b46f57beece4c83940bcffe580a4b4e2db5451a9c9ca4a9ad486fc95b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0669e6b46f57beece4c83940bcffe580a4b4e2db5451a9c9ca4a9ad486fc95b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zc9xl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:14Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:14 crc kubenswrapper[4832]: I1003 00:07:14.880941 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fad7ffe6-f721-4c38-8acd-2d4de59b527b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da0287ba03b221e49a8840f93c53df8a865b9216a4e93f4a0fc4c62d79227004\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edaa07dca4d8aba0a6439a43af7397b5d9ef4d5a398f5c03ac93f3f41fb80ed9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef77e4c71ea4d3dd7aa61afb0787bf88352730ae638cfefe1f9f9f033208692a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0636ac12ebbda7e4ee8bbe4f5be5454ca7a879c92c8f444466a34c387668398c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af8925d64d9d95117625cfca58e2f0c90128ce63c82370e3a505d48fce14e474\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"message\\\":\\\"file observer\\\\nW1003 00:07:06.003008 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 00:07:06.003232 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 00:07:06.007835 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-713505854/tls.crt::/tmp/serving-cert-713505854/tls.key\\\\\\\"\\\\nI1003 00:07:06.247728 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 00:07:06.253351 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 00:07:06.253377 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 00:07:06.253665 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 00:07:06.253677 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 00:07:06.261087 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 00:07:06.261596 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 00:07:06.261602 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 00:07:06.261606 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 00:07:06.261610 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 00:07:06.261613 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 00:07:06.261616 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 00:07:06.261905 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 00:07:06.264368 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc6fca5c4906594dc721c77dba62a306d840e88ed949a9a579a695bdc1f1c011\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://359b3f91c772c18a52ed1e5985275405b990187a860cf7a7f62c2f602f54c77e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://359b3f91c772c18a52ed1e5985275405b990187a860cf7a7f62c2f602f54c77e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:06:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:14Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:14 crc kubenswrapper[4832]: I1003 00:07:14.919827 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a530701-412d-4b02-b532-de7993881992\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2b0553c8f774a096bc1446325436a1a73a24f67b3512462ce0b5029092f30c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0451961c80d1affd1cc6903d8c4ccba7de47c25255826d49ee312958f7a73b79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09e57b2ff533ebe739ef6e23dcd8d6f5ef85143046db77ed3a95ffb87f771152\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://703d4dc13279deda56e8154ab7238b507e78852fd477db2bbe6e3e19aa0fbd37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:06:47Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:14Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:14 crc kubenswrapper[4832]: I1003 00:07:14.955119 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:14 crc kubenswrapper[4832]: I1003 00:07:14.955165 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:14 crc kubenswrapper[4832]: I1003 00:07:14.955177 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:14 crc kubenswrapper[4832]: I1003 00:07:14.955191 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:14 crc kubenswrapper[4832]: I1003 00:07:14.955202 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:14Z","lastTransitionTime":"2025-10-03T00:07:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:14 crc kubenswrapper[4832]: I1003 00:07:14.960220 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:14Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:15 crc kubenswrapper[4832]: I1003 00:07:15.001264 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"13eeb165-19f7-4e3f-a442-1c9c570d853b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://196d3468b2a4ea6792c19f85b28df1b5df5e612428d986d2affed70158ba92f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfnk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://786805325c320e5c08542c14a9ac9790f7c2dec237737fe1aca1523d1f2d26a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfnk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xzpk4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:14Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:15 crc kubenswrapper[4832]: I1003 00:07:15.047469 4832 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31c3e0e9-c2a7-4c60-b523-9972df4adad5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1688f7425bdf5630147752e6d2c0dc58351b86d087c39bc6c4d73ffbf11949b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70f4161badca9a8087ab95b77ede5bc72d0c44b3f7f0223f2073354e76d6070d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5eb1fa15d84200378239e8a064cb83aa401750e92073f9d29d94858e65add776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-di
r\\\"}]},{\\\"containerID\\\":\\\"cri-o://342b151ec083dc566428af857e4013b70d34fbf70632147d13f5e251ef3460c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f39a6c9881b2f1d620a85a3c6887d82b81d18ecf599c66b967d5580689537314\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a180babeb6469b2158474f74047864c74117a5c272be9437712bc1729563108b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a180babeb6469b2158474f74047864c74117a5c272be9437712bc1729563108b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b12fcbe54af7073178320f8a130e78196b2d1d3a055c4d8410e4cc1da2d4686e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b12fcbe54af7073178320f8a130e78196b2d1d3a055c4d8410e4cc1da2d4686e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}}},{\\\"containerID\\\"
:\\\"cri-o://f5e7853d0c8e96873c687c75d7226d8c11ed0ab5f4f2e1e975423de06b7c8c10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5e7853d0c8e96873c687c75d7226d8c11ed0ab5f4f2e1e975423de06b7c8c10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:06:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:15Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:15 crc kubenswrapper[4832]: I1003 00:07:15.058082 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:15 crc kubenswrapper[4832]: I1003 00:07:15.058116 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:15 crc kubenswrapper[4832]: I1003 00:07:15.058129 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:15 crc kubenswrapper[4832]: I1003 00:07:15.058146 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:15 crc kubenswrapper[4832]: I1003 00:07:15.058157 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:15Z","lastTransitionTime":"2025-10-03T00:07:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:15 crc kubenswrapper[4832]: I1003 00:07:15.083119 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:15Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:15 crc kubenswrapper[4832]: I1003 00:07:15.129392 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4175afbbdf7905f2c69f608e1cd85fdfae2818582d88e9a6d46c20ba0cede788\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:15Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:15 crc kubenswrapper[4832]: I1003 00:07:15.160543 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:15 crc kubenswrapper[4832]: I1003 00:07:15.160582 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:15 crc kubenswrapper[4832]: I1003 00:07:15.160594 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:15 crc kubenswrapper[4832]: I1003 00:07:15.160609 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:15 crc kubenswrapper[4832]: I1003 00:07:15.160620 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:15Z","lastTransitionTime":"2025-10-03T00:07:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:15 crc kubenswrapper[4832]: I1003 00:07:15.160968 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6753f7eb0f3ff7e4c94125513be77026242079d658b7139e86701a2b04ba5d1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://823db4c8596fa85759a3d818b0282b8b8b2c4c2e948160accbd306ef09e38060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:15Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:15 crc kubenswrapper[4832]: I1003 00:07:15.200370 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rwqt7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96bf1ee1-9249-4db9-83da-17eb15aa5f9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://778c9afa5747e424ff72688b9d90dd21e3faec1f409c2b9683b5a16a37685369\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm92t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rwqt7\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:15Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:15 crc kubenswrapper[4832]: I1003 00:07:15.237955 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-76ck7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b962b7f-e442-4e25-97f4-21f16b9f1324\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e5df7a035e3c381c541f08eec6fffe846e7002a534b4c6ae1316a15cf74e469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ksm2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-76ck7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:15Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:15 crc kubenswrapper[4832]: I1003 00:07:15.263323 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:15 crc kubenswrapper[4832]: I1003 00:07:15.263403 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:15 crc kubenswrapper[4832]: I1003 00:07:15.263416 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:15 crc 
kubenswrapper[4832]: I1003 00:07:15.263434 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:15 crc kubenswrapper[4832]: I1003 00:07:15.263446 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:15Z","lastTransitionTime":"2025-10-03T00:07:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:15 crc kubenswrapper[4832]: I1003 00:07:15.366549 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:15 crc kubenswrapper[4832]: I1003 00:07:15.366623 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:15 crc kubenswrapper[4832]: I1003 00:07:15.366634 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:15 crc kubenswrapper[4832]: I1003 00:07:15.366649 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:15 crc kubenswrapper[4832]: I1003 00:07:15.366659 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:15Z","lastTransitionTime":"2025-10-03T00:07:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:15 crc kubenswrapper[4832]: I1003 00:07:15.385999 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 00:07:15 crc kubenswrapper[4832]: E1003 00:07:15.386130 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 00:07:15 crc kubenswrapper[4832]: I1003 00:07:15.386301 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 00:07:15 crc kubenswrapper[4832]: E1003 00:07:15.386452 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 00:07:15 crc kubenswrapper[4832]: I1003 00:07:15.386503 4832 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 00:07:15 crc kubenswrapper[4832]: E1003 00:07:15.386645 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 00:07:15 crc kubenswrapper[4832]: I1003 00:07:15.468823 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:15 crc kubenswrapper[4832]: I1003 00:07:15.468863 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:15 crc kubenswrapper[4832]: I1003 00:07:15.468871 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:15 crc kubenswrapper[4832]: I1003 00:07:15.468885 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:15 crc kubenswrapper[4832]: I1003 00:07:15.468895 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:15Z","lastTransitionTime":"2025-10-03T00:07:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:15 crc kubenswrapper[4832]: I1003 00:07:15.571903 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:15 crc kubenswrapper[4832]: I1003 00:07:15.571957 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:15 crc kubenswrapper[4832]: I1003 00:07:15.571975 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:15 crc kubenswrapper[4832]: I1003 00:07:15.572000 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:15 crc kubenswrapper[4832]: I1003 00:07:15.572022 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:15Z","lastTransitionTime":"2025-10-03T00:07:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:15 crc kubenswrapper[4832]: I1003 00:07:15.590745 4832 generic.go:334] "Generic (PLEG): container finished" podID="66a6f849-6870-4e28-8ca4-c849c22ad252" containerID="f907c9873e2f8cbedebcc3116624dbaf2e6b8088d9f0ef52a5878e1515e73720" exitCode=0 Oct 03 00:07:15 crc kubenswrapper[4832]: I1003 00:07:15.590816 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zdfsz" event={"ID":"66a6f849-6870-4e28-8ca4-c849c22ad252","Type":"ContainerDied","Data":"f907c9873e2f8cbedebcc3116624dbaf2e6b8088d9f0ef52a5878e1515e73720"} Oct 03 00:07:15 crc kubenswrapper[4832]: I1003 00:07:15.605505 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fad7ffe6-f721-4c38-8acd-2d4de59b527b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da0287ba03b221e49a8840f93c53df8a865b9216a4e93f4a0fc4c62d79227004\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edaa07dca4d8aba0a6439a43af7397b5d9ef4d5a398f5c03ac93f3f41fb80ed9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kuber
netes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef77e4c71ea4d3dd7aa61afb0787bf88352730ae638cfefe1f9f9f033208692a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0636ac12ebbda7e4ee8bbe4f5be5454ca7a879c92c8f444466a34c387668398c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af8925d64d9d95117625cfca58e2f0c90128ce63c82370e3a505d48fce14e474\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"message\\\":\\\"file observer\\\\nW1003 00:07:06.003008 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 00:07:06.003232 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 00:07:06.007835 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-713505854/tls.crt::/tmp/serving-cert-713505854/tls.key\\\\\\\"\\\\nI1003 00:07:06.247728 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 00:07:06.253351 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 00:07:06.253377 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 00:07:06.253665 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 00:07:06.253677 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 00:07:06.261087 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 00:07:06.261596 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 00:07:06.261602 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 00:07:06.261606 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 00:07:06.261610 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 00:07:06.261613 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 00:07:06.261616 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 00:07:06.261905 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and 
discovery information is complete\\\\nF1003 00:07:06.264368 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc6fca5c4906594dc721c77dba62a306d840e88ed949a9a579a695bdc1f1c011\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://359b3f91c772c18a52ed1e5985275405b990187a860cf7a7f62c2f602f54c77e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://359b3f91c772c18a52ed1e5985275405b990187a860cf7a7f62c2f602f54c77e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:06:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:15Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:15 crc kubenswrapper[4832]: I1003 00:07:15.622368 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a530701-412d-4b02-b532-de7993881992\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2b0553c8f774a096bc1446325436a1a73a24f67b3512462ce0b5029092f30c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0451961c80d1affd1cc6903d8c4ccba7de47c25255826d49ee312958f7a73b79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09e57b2ff533ebe739ef6e23dcd8d6f5ef85143046db77ed3a95ffb87f771152\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://703d4dc13279deda56e8154ab7238b507e78852fd477db2bbe6e3e19aa0fbd37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:06:47Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:15Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:15 crc kubenswrapper[4832]: I1003 00:07:15.634666 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:15Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:15 crc kubenswrapper[4832]: I1003 00:07:15.644676 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13eeb165-19f7-4e3f-a442-1c9c570d853b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://196d3468b2a4ea6792c19f85b28df1b5df5e612428d986d2affed70158ba92f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfnk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://786805325c320e5c08542c14a9ac9790f7c2dec237737fe1aca1523d1f2d26a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfnk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xzpk4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:15Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:15 crc kubenswrapper[4832]: I1003 00:07:15.661536 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31c3e0e9-c2a7-4c60-b523-9972df4adad5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1688f7425bdf5630147752e6d2c0dc58351b86d087c39bc6c4d73ffbf11949b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70f4161badca9a8087ab95b77ede5bc72d0c44b3f7f0223f2073354e76d6070d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be
8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5eb1fa15d84200378239e8a064cb83aa401750e92073f9d29d94858e65add776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://342b151ec083dc566428af857e4013b70d34fbf70632147d13f5e251ef3460c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f39a6c9881b2f1d620a85a3c6887d82b81d18ecf599c66b967d5580689537314\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a180babeb6469b2158474f74047864c74117a5c272be9437712bc1729563108b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a180babeb6469b2158474f74047864c74117a5c272be9437712bc1729563108b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b12fcbe54af7073178320f8a130e78196b2d1d3a055c4d8410e4cc1da2d4686e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b12fcbe54af7073178320f8a130e78196b2d1d3a055c4d8410e4cc1da2d4686e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f5e7853d0c8e96873c687c75d7226d8c11ed0ab5f4f2e1e975423de06b7c8c10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5e7853d0c8e96873c687c75d7226d8c11ed0ab5f4f2e1e975423de06b7c8c10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:06:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:15Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:15 crc kubenswrapper[4832]: I1003 00:07:15.671493 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:15Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:15 crc kubenswrapper[4832]: I1003 00:07:15.675113 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:15 crc kubenswrapper[4832]: I1003 00:07:15.675143 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:15 crc kubenswrapper[4832]: I1003 00:07:15.675154 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:15 crc kubenswrapper[4832]: I1003 00:07:15.675169 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:15 crc kubenswrapper[4832]: I1003 00:07:15.675182 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:15Z","lastTransitionTime":"2025-10-03T00:07:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:15 crc kubenswrapper[4832]: I1003 00:07:15.685838 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4175afbbdf7905f2c69f608e1cd85fdfae2818582d88e9a6d46c20ba0cede788\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:15Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:15 crc kubenswrapper[4832]: I1003 00:07:15.696386 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6753f7eb0f3ff7e4c94125513be77026242079d658b7139e86701a2b04ba5d1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://823db4c8596fa85759a3d818b0282b8b8b2c4c2e948160accbd306ef09e38060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:15Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:15 crc kubenswrapper[4832]: I1003 00:07:15.707635 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rwqt7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96bf1ee1-9249-4db9-83da-17eb15aa5f9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://778c9afa5747e424ff72688b9d90dd21e3faec1f409c2b9683b5a16a37685369\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm92t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rwqt7\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:15Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:15 crc kubenswrapper[4832]: I1003 00:07:15.716308 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-76ck7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b962b7f-e442-4e25-97f4-21f16b9f1324\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e5df7a035e3c381c541f08eec6fffe846e7002a534b4c6ae1316a15cf74e469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ksm2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-76ck7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:15Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:15 crc kubenswrapper[4832]: I1003 00:07:15.727654 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:15Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:15 crc kubenswrapper[4832]: I1003 00:07:15.740212 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zdfsz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66a6f849-6870-4e28-8ca4-c849c22ad252\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f49f638397d6dffb41b5774d362b7441d3191f290149de6ecef5c38a6e8a7a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f49f638397d6dffb41b5774d362b7441d3191f290149de6ecef5c38a6e8a7a81\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d52a70ad1026d8eaaeed9dbb581eb90dbc962f036711e2f68085ac437c46b1fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d52a70ad1026d8eaaeed9dbb581eb90dbc962f036711e2f68085ac437c46b1fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ddd5a446bcc39d02bbaf8ae1b9a92b550b975c4280cea13a517076261e177bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ddd5a446bcc39d02bbaf8ae1b9a92b550b975c4280cea13a517076261e177bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f907c9873e2f8cbedebcc3116624dbaf2e6b8088d9f0ef52a5878e1515e73720\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f907c9873e2f8cbedebcc3116624dbaf2e6b8088d9f0ef52a5878e1515e73720\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-releas
e\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zdfsz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:15Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:15 crc kubenswrapper[4832]: I1003 00:07:15.759901 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78a6b04d90ea1fd4356f0a54594a83fd6029be216c6263654ad4916f2c3b7f5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:15Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:15 crc kubenswrapper[4832]: I1003 00:07:15.781489 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:15 crc kubenswrapper[4832]: I1003 00:07:15.781539 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:15 crc kubenswrapper[4832]: I1003 00:07:15.781567 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:15 crc kubenswrapper[4832]: I1003 00:07:15.781589 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:15 crc kubenswrapper[4832]: I1003 00:07:15.781601 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:15Z","lastTransitionTime":"2025-10-03T00:07:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:15 crc kubenswrapper[4832]: I1003 00:07:15.798783 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-b9c9g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4369beec-ad3f-4de9-909b-7d0f8c76aad3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://898c5e76b7f0bbad018b2aa9db1e27240ccbd9d5b66854f548aa6d664abc8799\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvpm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"i
p\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-b9c9g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:15Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:15 crc kubenswrapper[4832]: I1003 00:07:15.845122 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f189ec1-8169-4886-bff6-f67f7958050e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0669e6b46f57beece4c83940bcffe580a4b4e2db5451a9c9ca4a9ad486fc95b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0669e6b46f57beece4c83940bcffe580a4b4e2db5451a9c9ca4a9ad486fc95b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zc9xl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:15Z 
is after 2025-08-24T17:21:41Z" Oct 03 00:07:15 crc kubenswrapper[4832]: I1003 00:07:15.885939 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:15 crc kubenswrapper[4832]: I1003 00:07:15.885988 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:15 crc kubenswrapper[4832]: I1003 00:07:15.886009 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:15 crc kubenswrapper[4832]: I1003 00:07:15.886028 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:15 crc kubenswrapper[4832]: I1003 00:07:15.886039 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:15Z","lastTransitionTime":"2025-10-03T00:07:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:15 crc kubenswrapper[4832]: I1003 00:07:15.988827 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:15 crc kubenswrapper[4832]: I1003 00:07:15.988900 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:15 crc kubenswrapper[4832]: I1003 00:07:15.988914 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:15 crc kubenswrapper[4832]: I1003 00:07:15.988941 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:15 crc kubenswrapper[4832]: I1003 00:07:15.988955 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:15Z","lastTransitionTime":"2025-10-03T00:07:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:16 crc kubenswrapper[4832]: I1003 00:07:16.091099 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:16 crc kubenswrapper[4832]: I1003 00:07:16.091142 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:16 crc kubenswrapper[4832]: I1003 00:07:16.091151 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:16 crc kubenswrapper[4832]: I1003 00:07:16.091169 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:16 crc kubenswrapper[4832]: I1003 00:07:16.091181 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:16Z","lastTransitionTime":"2025-10-03T00:07:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:16 crc kubenswrapper[4832]: I1003 00:07:16.194240 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:16 crc kubenswrapper[4832]: I1003 00:07:16.194288 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:16 crc kubenswrapper[4832]: I1003 00:07:16.194301 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:16 crc kubenswrapper[4832]: I1003 00:07:16.194316 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:16 crc kubenswrapper[4832]: I1003 00:07:16.194326 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:16Z","lastTransitionTime":"2025-10-03T00:07:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:16 crc kubenswrapper[4832]: I1003 00:07:16.296927 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:16 crc kubenswrapper[4832]: I1003 00:07:16.296983 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:16 crc kubenswrapper[4832]: I1003 00:07:16.296993 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:16 crc kubenswrapper[4832]: I1003 00:07:16.297008 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:16 crc kubenswrapper[4832]: I1003 00:07:16.297018 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:16Z","lastTransitionTime":"2025-10-03T00:07:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:16 crc kubenswrapper[4832]: I1003 00:07:16.399850 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:16 crc kubenswrapper[4832]: I1003 00:07:16.399897 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:16 crc kubenswrapper[4832]: I1003 00:07:16.399905 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:16 crc kubenswrapper[4832]: I1003 00:07:16.399917 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:16 crc kubenswrapper[4832]: I1003 00:07:16.399926 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:16Z","lastTransitionTime":"2025-10-03T00:07:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:16 crc kubenswrapper[4832]: I1003 00:07:16.503130 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:16 crc kubenswrapper[4832]: I1003 00:07:16.503181 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:16 crc kubenswrapper[4832]: I1003 00:07:16.503193 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:16 crc kubenswrapper[4832]: I1003 00:07:16.503214 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:16 crc kubenswrapper[4832]: I1003 00:07:16.503227 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:16Z","lastTransitionTime":"2025-10-03T00:07:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:16 crc kubenswrapper[4832]: I1003 00:07:16.599464 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" event={"ID":"0f189ec1-8169-4886-bff6-f67f7958050e","Type":"ContainerStarted","Data":"ae1e89f85d166170fd478f58e653b46eaa1c8335c1b124af9fde5b805864197b"} Oct 03 00:07:16 crc kubenswrapper[4832]: I1003 00:07:16.602421 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zdfsz" event={"ID":"66a6f849-6870-4e28-8ca4-c849c22ad252","Type":"ContainerStarted","Data":"d991977ea5d7140ddc09fecfc872067a05a7773c0066c94ec88c603b6f79f32e"} Oct 03 00:07:16 crc kubenswrapper[4832]: I1003 00:07:16.615079 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:16 crc kubenswrapper[4832]: I1003 00:07:16.615147 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:16 crc kubenswrapper[4832]: I1003 00:07:16.615165 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:16 crc kubenswrapper[4832]: I1003 00:07:16.615188 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:16 crc kubenswrapper[4832]: I1003 00:07:16.615206 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:16Z","lastTransitionTime":"2025-10-03T00:07:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:16 crc kubenswrapper[4832]: I1003 00:07:16.630364 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fad7ffe6-f721-4c38-8acd-2d4de59b527b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da0287ba03b221e49a8840f93c53df8a865b9216a4e93f4a0fc4c62d79227004\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edaa07dca4d8aba0a6439a43af7397b5d9ef4d5a398f5c03ac93f3f41fb80ed9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef77e4c71ea4d3dd7aa61afb0787bf88352730ae638cfefe1f9f9f033208692a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0636ac12ebbda7e4ee8bbe4f5be5454ca7a879c92c8f444466a34c387668398c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af8925d64d9d95117625cfca58e2f0c90128ce63c82370e3a505d48fce14e474\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"message\\\":\\\"file observer\\\\nW1003 00:07:06.003008 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 00:07:06.003232 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 00:07:06.007835 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-713505854/tls.crt::/tmp/serving-cert-713505854/tls.key\\\\\\\"\\\\nI1003 00:07:06.247728 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 00:07:06.253351 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 00:07:06.253377 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 00:07:06.253665 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 00:07:06.253677 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 00:07:06.261087 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 00:07:06.261596 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 00:07:06.261602 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 00:07:06.261606 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 00:07:06.261610 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 00:07:06.261613 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 00:07:06.261616 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 00:07:06.261905 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 00:07:06.264368 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc6fca5c4906594dc721c77dba62a306d840e88ed949a9a579a695bdc1f1c011\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://359b3f91c772c18a52ed1e5985275405b990187a860cf7a7f62c2f602f54c77e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://359b3f91c772c18a52ed1e5985275405b990187a860cf7a7f62c2f602f54c77e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:06:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:16Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:16 crc kubenswrapper[4832]: I1003 00:07:16.648784 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a530701-412d-4b02-b532-de7993881992\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2b0553c8f774a096bc1446325436a1a73a24f67b3512462ce0b5029092f30c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0451961c80d1affd1cc6903d8c4ccba7de47c25255826d49ee312958f7a73b79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09e57b2ff533ebe739ef6e23dcd8d6f5ef85143046db77ed3a95ffb87f771152\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://703d4dc13279deda56e8154ab7238b507e78852fd477db2bbe6e3e19aa0fbd37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:06:47Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:16Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:16 crc kubenswrapper[4832]: I1003 00:07:16.662279 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:16Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:16 crc kubenswrapper[4832]: I1003 00:07:16.679348 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13eeb165-19f7-4e3f-a442-1c9c570d853b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://196d3468b2a4ea6792c19f85b28df1b5df5e612428d986d2affed70158ba92f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfnk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://786805325c320e5c08542c14a9ac9790f7c2dec237737fe1aca1523d1f2d26a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfnk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xzpk4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:16Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:16 crc kubenswrapper[4832]: I1003 00:07:16.699865 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31c3e0e9-c2a7-4c60-b523-9972df4adad5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1688f7425bdf5630147752e6d2c0dc58351b86d087c39bc6c4d73ffbf11949b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70f4161badca9a8087ab95b77ede5bc72d0c44b3f7f0223f2073354e76d6070d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be
8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5eb1fa15d84200378239e8a064cb83aa401750e92073f9d29d94858e65add776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://342b151ec083dc566428af857e4013b70d34fbf70632147d13f5e251ef3460c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f39a6c9881b2f1d620a85a3c6887d82b81d18ecf599c66b967d5580689537314\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a180babeb6469b2158474f74047864c74117a5c272be9437712bc1729563108b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a180babeb6469b2158474f74047864c74117a5c272be9437712bc1729563108b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b12fcbe54af7073178320f8a130e78196b2d1d3a055c4d8410e4cc1da2d4686e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b12fcbe54af7073178320f8a130e78196b2d1d3a055c4d8410e4cc1da2d4686e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f5e7853d0c8e96873c687c75d7226d8c11ed0ab5f4f2e1e975423de06b7c8c10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5e7853d0c8e96873c687c75d7226d8c11ed0ab5f4f2e1e975423de06b7c8c10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:06:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:16Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:16 crc kubenswrapper[4832]: I1003 00:07:16.716804 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:16Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:16 crc kubenswrapper[4832]: I1003 00:07:16.718805 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:16 crc kubenswrapper[4832]: I1003 00:07:16.718846 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:16 crc kubenswrapper[4832]: I1003 00:07:16.718862 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:16 crc kubenswrapper[4832]: I1003 00:07:16.718886 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:16 crc kubenswrapper[4832]: I1003 00:07:16.718903 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:16Z","lastTransitionTime":"2025-10-03T00:07:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:16 crc kubenswrapper[4832]: I1003 00:07:16.734822 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4175afbbdf7905f2c69f608e1cd85fdfae2818582d88e9a6d46c20ba0cede788\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:16Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:16 crc kubenswrapper[4832]: I1003 00:07:16.757674 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6753f7eb0f3ff7e4c94125513be77026242079d658b7139e86701a2b04ba5d1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://823db4c8596fa85759a3d818b0282b8b8b2c4c2e948160accbd306ef09e38060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:16Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:16 crc kubenswrapper[4832]: I1003 00:07:16.776574 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rwqt7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96bf1ee1-9249-4db9-83da-17eb15aa5f9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://778c9afa5747e424ff72688b9d90dd21e3faec1f409c2b9683b5a16a37685369\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm92t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rwqt7\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:16Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:16 crc kubenswrapper[4832]: I1003 00:07:16.798699 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-76ck7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b962b7f-e442-4e25-97f4-21f16b9f1324\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e5df7a035e3c381c541f08eec6fffe846e7002a534b4c6ae1316a15cf74e469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ksm2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-76ck7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:16Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:16 crc kubenswrapper[4832]: I1003 00:07:16.813788 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:16Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:16 crc kubenswrapper[4832]: I1003 00:07:16.820824 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:16 crc kubenswrapper[4832]: I1003 00:07:16.821066 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:16 crc kubenswrapper[4832]: I1003 00:07:16.821253 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:16 crc kubenswrapper[4832]: I1003 00:07:16.821428 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:16 crc kubenswrapper[4832]: I1003 00:07:16.821592 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:16Z","lastTransitionTime":"2025-10-03T00:07:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:16 crc kubenswrapper[4832]: I1003 00:07:16.833127 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zdfsz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66a6f849-6870-4e28-8ca4-c849c22ad252\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f49f638397d6dffb41b5774d362b7441d3191f290149de6ecef5c38a6e8a7a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f49f638397d6dffb41b5774d362b7441d3191f290149de6ecef5c38a6e8a7a81\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d52a70ad1026d8eaaeed9dbb581eb90dbc962f036711e2f68085ac437c46b1fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d52a70ad1026d8eaaeed9dbb581eb90dbc962f036711e2f68085ac437c46b1fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ddd5a446bcc39d02bbaf8ae1b9a92b550b975c4280cea13a517076261e177bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ddd5a446bcc39d02bbaf8ae1b9a92b550b975c4280cea13a517076261e177bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f907c9873e2f8cbedebcc3116624dbaf2e6b8088d9f0ef52a5878e1515e73720\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f907c9873e2f8cbedebcc3116624dbaf2e6b8088d9f0ef52a5878e1515e73720\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:
07:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d991977ea5d7140ddc09fecfc872067a05a7773c0066c94ec88c603b6f79f32e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zdfsz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:16Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:16 crc kubenswrapper[4832]: I1003 00:07:16.853103 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78a6b04d90ea1fd4356f0a54594a83fd6029be216c6263654ad4916f2c3b7f5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:16Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:16 crc kubenswrapper[4832]: I1003 00:07:16.868635 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-b9c9g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4369beec-ad3f-4de9-909b-7d0f8c76aad3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://898c5e76b7f0bbad018b2aa9db1e27240ccbd9d5b66854f548aa6d664abc8799\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvpm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-b9c9g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:16Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:16 crc kubenswrapper[4832]: I1003 00:07:16.896421 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f189ec1-8169-4886-bff6-f67f7958050e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":
\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0669e6b46f57beece4c83940bcffe580a4b4e2db5451a9c9ca4a9ad486fc95b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0669e6b46f57beece4c83940bcffe580a4b4e2db5451a9c9ca4a9ad486fc95b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11
Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zc9xl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:16Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:16 crc kubenswrapper[4832]: I1003 00:07:16.924647 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:16 crc kubenswrapper[4832]: I1003 00:07:16.925027 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:16 crc kubenswrapper[4832]: I1003 00:07:16.925218 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:16 crc kubenswrapper[4832]: I1003 00:07:16.925421 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:16 crc kubenswrapper[4832]: I1003 00:07:16.925551 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:16Z","lastTransitionTime":"2025-10-03T00:07:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:17 crc kubenswrapper[4832]: I1003 00:07:17.029350 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:17 crc kubenswrapper[4832]: I1003 00:07:17.029398 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:17 crc kubenswrapper[4832]: I1003 00:07:17.029412 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:17 crc kubenswrapper[4832]: I1003 00:07:17.029432 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:17 crc kubenswrapper[4832]: I1003 00:07:17.029447 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:17Z","lastTransitionTime":"2025-10-03T00:07:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:17 crc kubenswrapper[4832]: I1003 00:07:17.131868 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:17 crc kubenswrapper[4832]: I1003 00:07:17.131948 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:17 crc kubenswrapper[4832]: I1003 00:07:17.131968 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:17 crc kubenswrapper[4832]: I1003 00:07:17.132002 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:17 crc kubenswrapper[4832]: I1003 00:07:17.132026 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:17Z","lastTransitionTime":"2025-10-03T00:07:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:17 crc kubenswrapper[4832]: I1003 00:07:17.234507 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:17 crc kubenswrapper[4832]: I1003 00:07:17.234578 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:17 crc kubenswrapper[4832]: I1003 00:07:17.234587 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:17 crc kubenswrapper[4832]: I1003 00:07:17.234602 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:17 crc kubenswrapper[4832]: I1003 00:07:17.234611 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:17Z","lastTransitionTime":"2025-10-03T00:07:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:17 crc kubenswrapper[4832]: I1003 00:07:17.337018 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:17 crc kubenswrapper[4832]: I1003 00:07:17.337662 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:17 crc kubenswrapper[4832]: I1003 00:07:17.337777 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:17 crc kubenswrapper[4832]: I1003 00:07:17.337861 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:17 crc kubenswrapper[4832]: I1003 00:07:17.337935 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:17Z","lastTransitionTime":"2025-10-03T00:07:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:17 crc kubenswrapper[4832]: I1003 00:07:17.386150 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 00:07:17 crc kubenswrapper[4832]: I1003 00:07:17.386218 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 00:07:17 crc kubenswrapper[4832]: I1003 00:07:17.386501 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 00:07:17 crc kubenswrapper[4832]: E1003 00:07:17.386787 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 00:07:17 crc kubenswrapper[4832]: E1003 00:07:17.386867 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 00:07:17 crc kubenswrapper[4832]: E1003 00:07:17.386987 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 00:07:17 crc kubenswrapper[4832]: I1003 00:07:17.408735 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78a6b04d90ea1fd4356f0a54594a83fd6029be216c6263654ad4916f2c3b7f5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:17Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:17 crc kubenswrapper[4832]: I1003 00:07:17.420165 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-b9c9g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4369beec-ad3f-4de9-909b-7d0f8c76aad3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://898c5e76b7f0bbad018b2aa9db1e27240ccbd9d5b66854f548aa6d664abc8799\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvpm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-b9c9g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:17Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:17 crc kubenswrapper[4832]: I1003 00:07:17.438530 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f189ec1-8169-4886-bff6-f67f7958050e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":
\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0669e6b46f57beece4c83940bcffe580a4b4e2db5451a9c9ca4a9ad486fc95b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0669e6b46f57beece4c83940bcffe580a4b4e2db5451a9c9ca4a9ad486fc95b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11
Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zc9xl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:17Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:17 crc kubenswrapper[4832]: I1003 00:07:17.439927 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:17 crc kubenswrapper[4832]: I1003 00:07:17.440023 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:17 crc kubenswrapper[4832]: I1003 00:07:17.440090 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:17 crc kubenswrapper[4832]: I1003 00:07:17.440151 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:17 crc kubenswrapper[4832]: I1003 00:07:17.440206 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:17Z","lastTransitionTime":"2025-10-03T00:07:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:17 crc kubenswrapper[4832]: I1003 00:07:17.461239 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fad7ffe6-f721-4c38-8acd-2d4de59b527b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da0287ba03b221e49a8840f93c53df8a865b9216a4e93f4a0fc4c62d79227004\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edaa07dca4d8aba0a6439a43af7397b5d9ef4d5a398f5c03ac93f3f41fb80ed9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef77e4c71ea4d3dd7aa61afb0787bf88352730ae638cfefe1f9f9f033208692a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0636ac12ebbda7e4ee8bbe4f5be5454ca7a879c92c8f444466a34c387668398c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af8925d64d9d95117625cfca58e2f0c90128ce63c82370e3a505d48fce14e474\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"message\\\":\\\"file observer\\\\nW1003 00:07:06.003008 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 00:07:06.003232 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 00:07:06.007835 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-713505854/tls.crt::/tmp/serving-cert-713505854/tls.key\\\\\\\"\\\\nI1003 00:07:06.247728 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 00:07:06.253351 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 00:07:06.253377 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 00:07:06.253665 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 00:07:06.253677 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 00:07:06.261087 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 00:07:06.261596 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 00:07:06.261602 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 00:07:06.261606 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 00:07:06.261610 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 00:07:06.261613 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 00:07:06.261616 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 00:07:06.261905 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 00:07:06.264368 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc6fca5c4906594dc721c77dba62a306d840e88ed949a9a579a695bdc1f1c011\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://359b3f91c772c18a52ed1e5985275405b990187a860cf7a7f62c2f602f54c77e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://359b3f91c772c18a52ed1e5985275405b990187a860cf7a7f62c2f602f54c77e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:06:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:17Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:17 crc kubenswrapper[4832]: I1003 00:07:17.476058 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a530701-412d-4b02-b532-de7993881992\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2b0553c8f774a096bc1446325436a1a73a24f67b3512462ce0b5029092f30c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0451961c80d1affd1cc6903d8c4ccba7de47c25255826d49ee312958f7a73b79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09e57b2ff533ebe739ef6e23dcd8d6f5ef85143046db77ed3a95ffb87f771152\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://703d4dc13279deda56e8154ab7238b507e78852fd477db2bbe6e3e19aa0fbd37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:06:47Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:17Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:17 crc kubenswrapper[4832]: I1003 00:07:17.497817 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:17Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:17 crc kubenswrapper[4832]: I1003 00:07:17.514737 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13eeb165-19f7-4e3f-a442-1c9c570d853b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://196d3468b2a4ea6792c19f85b28df1b5df5e612428d986d2affed70158ba92f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfnk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://786805325c320e5c08542c14a9ac9790f7c2dec237737fe1aca1523d1f2d26a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfnk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xzpk4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:17Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:17 crc kubenswrapper[4832]: I1003 00:07:17.535479 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31c3e0e9-c2a7-4c60-b523-9972df4adad5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1688f7425bdf5630147752e6d2c0dc58351b86d087c39bc6c4d73ffbf11949b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70f4161badca9a8087ab95b77ede5bc72d0c44b3f7f0223f2073354e76d6070d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be
8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5eb1fa15d84200378239e8a064cb83aa401750e92073f9d29d94858e65add776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://342b151ec083dc566428af857e4013b70d34fbf70632147d13f5e251ef3460c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f39a6c9881b2f1d620a85a3c6887d82b81d18ecf599c66b967d5580689537314\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a180babeb6469b2158474f74047864c74117a5c272be9437712bc1729563108b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a180babeb6469b2158474f74047864c74117a5c272be9437712bc1729563108b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b12fcbe54af7073178320f8a130e78196b2d1d3a055c4d8410e4cc1da2d4686e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b12fcbe54af7073178320f8a130e78196b2d1d3a055c4d8410e4cc1da2d4686e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f5e7853d0c8e96873c687c75d7226d8c11ed0ab5f4f2e1e975423de06b7c8c10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5e7853d0c8e96873c687c75d7226d8c11ed0ab5f4f2e1e975423de06b7c8c10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:06:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:17Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:17 crc kubenswrapper[4832]: I1003 00:07:17.542550 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:17 crc kubenswrapper[4832]: I1003 00:07:17.542586 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:17 crc kubenswrapper[4832]: I1003 00:07:17.542609 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:17 crc kubenswrapper[4832]: I1003 00:07:17.542623 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Oct 03 00:07:17 crc kubenswrapper[4832]: I1003 00:07:17.542632 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:17Z","lastTransitionTime":"2025-10-03T00:07:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:17 crc kubenswrapper[4832]: I1003 00:07:17.551070 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:17Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:17 crc kubenswrapper[4832]: I1003 00:07:17.565784 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4175afbbdf7905f2c69f608e1cd85fdfae2818582d88e9a6d46c20ba0cede788\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:17Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:17 crc kubenswrapper[4832]: I1003 00:07:17.578958 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6753f7eb0f3ff7e4c94125513be77026242079d658b7139e86701a2b04ba5d1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://823db4c8596fa85759a3d818b0282b8b8b2c4c2e948160accbd306ef09e38060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:17Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:17 crc kubenswrapper[4832]: I1003 00:07:17.595067 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rwqt7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96bf1ee1-9249-4db9-83da-17eb15aa5f9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://778c9afa5747e424ff72688b9d90dd21e3faec1f409c2b9683b5a16a37685369\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm92t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rwqt7\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:17Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:17 crc kubenswrapper[4832]: I1003 00:07:17.606874 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-76ck7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b962b7f-e442-4e25-97f4-21f16b9f1324\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e5df7a035e3c381c541f08eec6fffe846e7002a534b4c6ae1316a15cf74e469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ksm2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-76ck7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:17Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:17 crc kubenswrapper[4832]: I1003 00:07:17.610246 4832 generic.go:334] "Generic (PLEG): container finished" podID="66a6f849-6870-4e28-8ca4-c849c22ad252" containerID="d991977ea5d7140ddc09fecfc872067a05a7773c0066c94ec88c603b6f79f32e" exitCode=0 Oct 03 00:07:17 crc kubenswrapper[4832]: I1003 00:07:17.610300 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zdfsz" 
event={"ID":"66a6f849-6870-4e28-8ca4-c849c22ad252","Type":"ContainerDied","Data":"d991977ea5d7140ddc09fecfc872067a05a7773c0066c94ec88c603b6f79f32e"} Oct 03 00:07:17 crc kubenswrapper[4832]: I1003 00:07:17.620374 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:17Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:17 crc kubenswrapper[4832]: I1003 00:07:17.638106 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zdfsz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66a6f849-6870-4e28-8ca4-c849c22ad252\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f49f638397d6dffb41b5774d362b7441d3191f290149de6ecef5c38a6e8a7a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f49f638397d6dffb41b5774d362b7441d3191f290149de6ecef5c38a6e8a7a81\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d52a70ad1026d8eaaeed9dbb581eb90dbc962f036711e2f68085ac437c46b1fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d52a70ad1026d8eaaeed9dbb581eb90dbc962f036711e2f68085ac437c46b1fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\
"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ddd5a446bcc39d02bbaf8ae1b9a92b550b975c4280cea13a517076261e177bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ddd5a446bcc39d02bbaf8ae1b9a92b550b975c4280cea13a517076261e177bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f907c9873e2f8cbedebcc3116624dbaf2e6b8088d9f0ef52a5878e1515e73720\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f907c9873e2f8cbedebcc3116624dbaf2e6b8088d9f0ef52a5878e1515e73720\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d991977ea5d7140ddc09fecfc872067a05a7773c0066c94ec88c603b6f79f32e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9
8100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zdfsz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:17Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:17 crc kubenswrapper[4832]: I1003 00:07:17.644992 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:17 crc kubenswrapper[4832]: I1003 00:07:17.645035 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:17 crc kubenswrapper[4832]: I1003 00:07:17.645045 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:17 crc kubenswrapper[4832]: I1003 00:07:17.645062 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:17 crc kubenswrapper[4832]: I1003 00:07:17.645074 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:17Z","lastTransitionTime":"2025-10-03T00:07:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:17 crc kubenswrapper[4832]: I1003 00:07:17.653909 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:17Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:17 crc kubenswrapper[4832]: I1003 00:07:17.665480 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"13eeb165-19f7-4e3f-a442-1c9c570d853b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://196d3468b2a4ea6792c19f85b28df1b5df5e612428d986d2affed70158ba92f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfnk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://786805325c320e5c08542c14a9ac9790f7c2dec237737fe1aca1523d1f2d26a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfnk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xzpk4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:17Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:17 crc kubenswrapper[4832]: I1003 00:07:17.679483 4832 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fad7ffe6-f721-4c38-8acd-2d4de59b527b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da0287ba03b221e49a8840f93c53df8a865b9216a4e93f4a0fc4c62d79227004\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edaa07dca4d8aba0a6439a43af7397b5d9ef4d5a398f5c03ac93f3f41fb80ed9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef77e4c71ea4d3dd7aa61afb0787bf88352730ae638cfefe1f9f9f033208692a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\
"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0636ac12ebbda7e4ee8bbe4f5be5454ca7a879c92c8f444466a34c387668398c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af8925d64d9d95117625cfca58e2f0c90128ce63c82370e3a505d48fce14e474\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"message\\\":\\\"file observer\\\\nW1003 00:07:06.003008 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 00:07:06.003232 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 00:07:06.007835 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-713505854/tls.crt::/tmp/serving-cert-713505854/tls.key\\\\\\\"\\\\nI1003 00:07:06.247728 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 00:07:06.253351 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 00:07:06.253377 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 00:07:06.253665 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 00:07:06.253677 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 00:07:06.261087 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 00:07:06.261596 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 00:07:06.261602 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 00:07:06.261606 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 00:07:06.261610 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 00:07:06.261613 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 00:07:06.261616 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 00:07:06.261905 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 00:07:06.264368 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc6fca5c4906594dc721c77dba62a306d840e88ed949a9a579a695bdc1f1c011\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://359b3f91c772c18a52ed1e5985275405b990187a860cf7a7f62c2f602f54c77e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://359b3f91c772c18a52ed1e5985275405b990187a860cf7a7f62c2f602f54c77e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:06:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:17Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:17 crc kubenswrapper[4832]: I1003 00:07:17.692918 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a530701-412d-4b02-b532-de7993881992\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2b0553c8f774a096bc1446325436a1a73a24f67b3512462ce0b5029092f30c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0451961c80d1affd1cc6903d8c4ccba7de47c25255826d49ee312958f7a73b79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09e57b2ff533ebe739ef6e23dcd8d6f5ef85143046db77ed3a95ffb87f771152\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://703d4dc13279deda56e8154ab7238b507e78852fd477db2bbe6e3e19aa0fbd37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:06:47Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:17Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:17 crc kubenswrapper[4832]: I1003 00:07:17.707822 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6753f7eb0f3ff7e4c94125513be77026242079d658b7139e86701a2b04ba5d1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://823db4c8596fa85759a3d818b0282b8b8b2c4c2e948160accbd306ef09e38060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482
919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:17Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:17 crc kubenswrapper[4832]: I1003 00:07:17.722094 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rwqt7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96bf1ee1-9249-4db9-83da-17eb15aa5f9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://778c9afa5747e424ff72688b9d90dd21e3faec1f409c2b9683b5a16a37685369\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/c
ni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm92t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rwqt7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:17Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:17 crc kubenswrapper[4832]: I1003 00:07:17.737010 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-76ck7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b962b7f-e442-4e25-97f4-21f16b9f1324\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e5df7a035e3c381c541f08eec6fffe846e7002a534b4c6ae1316a15cf74e469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ksm2
n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-76ck7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:17Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:17 crc kubenswrapper[4832]: I1003 00:07:17.748226 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:17 crc kubenswrapper[4832]: I1003 00:07:17.748264 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:17 crc kubenswrapper[4832]: I1003 00:07:17.748274 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:17 crc kubenswrapper[4832]: I1003 00:07:17.748290 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:17 crc kubenswrapper[4832]: I1003 00:07:17.748301 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:17Z","lastTransitionTime":"2025-10-03T00:07:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:17 crc kubenswrapper[4832]: I1003 00:07:17.759516 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31c3e0e9-c2a7-4c60-b523-9972df4adad5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1688f7425bdf5630147752e6d2c0dc58351b86d087c39bc6c4d73ffbf11949b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70f4161badca9a8087ab95b77ede5bc72d0c44b3f7f0223f2073354e76d6070d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5eb1fa15d84200378239e8a064cb83aa401750e92073f9d29d94858e65add776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://342b151ec083dc566428af857e4013b70d34fbf70632147d13f5e251ef3460c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f39a6c9881b2f1d620a85a3c6887d82b81d18ecf599c66b967d5580689537314\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a180babeb6469b2158474f74047864c74117a5c272be9437712bc1729563108b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a180babeb6469b2158474f74047864c74117a5c272be9437712bc1729563108b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b12fcbe54af7073178320f8a130e78196b2d1d3a055c4d8410e4cc1da2d4686e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b12fcbe54af7073178320f8a130e78196b2d1d3a055c4d8410e4cc1da2d4686e\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f5e7853d0c8e96873c687c75d7226d8c11ed0ab5f4f2e1e975423de06b7c8c10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5e7853d0c8e96873c687c75d7226d8c11ed0ab5f4f2e1e975423de06b7c8c10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:06:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:17Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:17 crc kubenswrapper[4832]: I1003 00:07:17.772825 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:17Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:17 crc kubenswrapper[4832]: I1003 00:07:17.787519 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4175afbbdf7905f2c69f608e1cd85fdfae2818582d88e9a6d46c20ba0cede788\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:17Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:17 crc kubenswrapper[4832]: I1003 00:07:17.801457 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:17Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:17 crc kubenswrapper[4832]: I1003 00:07:17.818499 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zdfsz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66a6f849-6870-4e28-8ca4-c849c22ad252\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f49f638397d6dffb41b5774d362b7441d3191f290149de6ecef5c38a6e8a7a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f49f638397d6dffb41b5774d362b7441d3191f290149de6ecef5c38a6e8a7a81\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d52a70ad1026d8eaaeed9dbb581eb90dbc962f036711e2f68085ac437c46b1fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d52a70ad1026d8eaaeed9dbb581eb90dbc962f036711e2f68085ac437c46b1fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ddd5a446bcc39d02bbaf8ae1b9a92b550b975c4280cea13a517076261e177bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ddd5a446bcc39d02bbaf8ae1b9a92b550b975c4280cea13a517076261e177bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f907c9873e2f8cbedebcc3116624dbaf2e6b8088d9f0ef52a5878e1515e73720\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f907c9873e2f8cbedebcc3116624dbaf2e6b8088d9f0ef52a5878e1515e73720\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d991977ea5d7140ddc09fecfc872067a05a7773c0066c94ec88c603b6f79f32e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d99
1977ea5d7140ddc09fecfc872067a05a7773c0066c94ec88c603b6f79f32e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zdfsz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:17Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:17 crc kubenswrapper[4832]: I1003 00:07:17.831518 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78a6b04d90ea1fd4356f0a54594a83fd6029be216c6263654ad4916f2c3b7f5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:17Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:17 crc kubenswrapper[4832]: I1003 00:07:17.841130 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-b9c9g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4369beec-ad3f-4de9-909b-7d0f8c76aad3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://898c5e76b7f0bbad018b2aa9db1e27240ccbd9d5b66854f548aa6d664abc8799\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvpm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-b9c9g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:17Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:17 crc kubenswrapper[4832]: I1003 00:07:17.849819 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:17 crc kubenswrapper[4832]: I1003 00:07:17.849844 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:17 crc kubenswrapper[4832]: I1003 00:07:17.849853 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:17 crc kubenswrapper[4832]: I1003 00:07:17.849865 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:17 crc kubenswrapper[4832]: I1003 00:07:17.849876 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:17Z","lastTransitionTime":"2025-10-03T00:07:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:17 crc kubenswrapper[4832]: I1003 00:07:17.858947 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f189ec1-8169-4886-bff6-f67f7958050e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"
ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",
\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"c
ri-o://0669e6b46f57beece4c83940bcffe580a4b4e2db5451a9c9ca4a9ad486fc95b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0669e6b46f57beece4c83940bcffe580a4b4e2db5451a9c9ca4a9ad486fc95b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zc9xl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:17Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:17 crc kubenswrapper[4832]: I1003 00:07:17.954110 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:17 crc kubenswrapper[4832]: I1003 00:07:17.954154 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:17 crc kubenswrapper[4832]: I1003 00:07:17.954164 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:17 crc kubenswrapper[4832]: I1003 00:07:17.954181 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:17 crc kubenswrapper[4832]: I1003 00:07:17.954191 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:17Z","lastTransitionTime":"2025-10-03T00:07:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:18 crc kubenswrapper[4832]: I1003 00:07:18.056372 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:18 crc kubenswrapper[4832]: I1003 00:07:18.056426 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:18 crc kubenswrapper[4832]: I1003 00:07:18.056445 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:18 crc kubenswrapper[4832]: I1003 00:07:18.056471 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:18 crc kubenswrapper[4832]: I1003 00:07:18.056491 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:18Z","lastTransitionTime":"2025-10-03T00:07:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:18 crc kubenswrapper[4832]: I1003 00:07:18.159140 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:18 crc kubenswrapper[4832]: I1003 00:07:18.159468 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:18 crc kubenswrapper[4832]: I1003 00:07:18.159479 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:18 crc kubenswrapper[4832]: I1003 00:07:18.159495 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:18 crc kubenswrapper[4832]: I1003 00:07:18.159507 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:18Z","lastTransitionTime":"2025-10-03T00:07:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:18 crc kubenswrapper[4832]: I1003 00:07:18.262821 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:18 crc kubenswrapper[4832]: I1003 00:07:18.262875 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:18 crc kubenswrapper[4832]: I1003 00:07:18.262892 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:18 crc kubenswrapper[4832]: I1003 00:07:18.262912 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:18 crc kubenswrapper[4832]: I1003 00:07:18.262927 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:18Z","lastTransitionTime":"2025-10-03T00:07:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:18 crc kubenswrapper[4832]: I1003 00:07:18.366140 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:18 crc kubenswrapper[4832]: I1003 00:07:18.366202 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:18 crc kubenswrapper[4832]: I1003 00:07:18.366219 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:18 crc kubenswrapper[4832]: I1003 00:07:18.366247 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:18 crc kubenswrapper[4832]: I1003 00:07:18.366265 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:18Z","lastTransitionTime":"2025-10-03T00:07:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:18 crc kubenswrapper[4832]: I1003 00:07:18.469189 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:18 crc kubenswrapper[4832]: I1003 00:07:18.469281 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:18 crc kubenswrapper[4832]: I1003 00:07:18.469311 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:18 crc kubenswrapper[4832]: I1003 00:07:18.469346 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:18 crc kubenswrapper[4832]: I1003 00:07:18.469373 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:18Z","lastTransitionTime":"2025-10-03T00:07:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:18 crc kubenswrapper[4832]: I1003 00:07:18.571580 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:18 crc kubenswrapper[4832]: I1003 00:07:18.571645 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:18 crc kubenswrapper[4832]: I1003 00:07:18.571658 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:18 crc kubenswrapper[4832]: I1003 00:07:18.571680 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:18 crc kubenswrapper[4832]: I1003 00:07:18.571694 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:18Z","lastTransitionTime":"2025-10-03T00:07:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:18 crc kubenswrapper[4832]: I1003 00:07:18.622106 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" event={"ID":"0f189ec1-8169-4886-bff6-f67f7958050e","Type":"ContainerStarted","Data":"1961487ddf2d62133f833782a0d3d51915a06b7f33a8b93748baea1713f0edb6"} Oct 03 00:07:18 crc kubenswrapper[4832]: I1003 00:07:18.622514 4832 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" Oct 03 00:07:18 crc kubenswrapper[4832]: I1003 00:07:18.630634 4832 generic.go:334] "Generic (PLEG): container finished" podID="66a6f849-6870-4e28-8ca4-c849c22ad252" containerID="7711e37ec666f0285f64eac11397ea7747d74997deaba1e9cc141218d28fbaa7" exitCode=0 Oct 03 00:07:18 crc kubenswrapper[4832]: I1003 00:07:18.630683 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zdfsz" event={"ID":"66a6f849-6870-4e28-8ca4-c849c22ad252","Type":"ContainerDied","Data":"7711e37ec666f0285f64eac11397ea7747d74997deaba1e9cc141218d28fbaa7"} Oct 03 00:07:18 crc kubenswrapper[4832]: I1003 00:07:18.641775 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zdfsz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66a6f849-6870-4e28-8ca4-c849c22ad252\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f49f638397d6dffb41b5774d362b7441d3191f290149de6ecef5c38a6e8a7a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f49f638397d6dffb41b5774d362b7441d3191f290149de6ecef5c38a6e8a7a81\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d52a70ad1026d8eaaeed9dbb581eb90dbc962f036711e2f68085ac437c46b1fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d52a70ad1026d8eaaeed9dbb581eb90dbc962f036711e2f68085ac437c46b1fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ddd5a446bcc39d02bbaf8ae1b9a92b550b975c4280cea13a517076261e177bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ddd5a446bcc39d02bbaf8ae1b9a92b550b975c4280cea13a517076261e177bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f907c9873e2f8cbedebcc3116624dbaf2e6b8088d9f0ef52a5878e1515e73720\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f907c9873e2f8cbedebcc3116624dbaf2e6b8088d9f0ef52a5878e1515e73720\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d991977ea5d7140ddc09fecfc872067a05a7773c0066c94ec88c603b6f79f32e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d991977ea5d7140ddc09fecfc872067a05a7773c0066c94ec88c603b6f79f32e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zdfsz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:18Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:18 crc kubenswrapper[4832]: I1003 00:07:18.648580 4832 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" Oct 03 00:07:18 crc kubenswrapper[4832]: I1003 00:07:18.659738 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:18Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:18 crc kubenswrapper[4832]: I1003 00:07:18.681351 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:18 crc kubenswrapper[4832]: I1003 00:07:18.681399 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:18 crc kubenswrapper[4832]: I1003 00:07:18.681411 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:18 crc kubenswrapper[4832]: I1003 00:07:18.681433 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:18 crc kubenswrapper[4832]: I1003 00:07:18.681445 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:18Z","lastTransitionTime":"2025-10-03T00:07:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:18 crc kubenswrapper[4832]: I1003 00:07:18.688974 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-b9c9g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4369beec-ad3f-4de9-909b-7d0f8c76aad3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://898c5e76b7f0bbad018b2aa9db1e27240ccbd9d5b66854f548aa6d664abc8799\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvpm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-b9c9g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:18Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:18 crc kubenswrapper[4832]: I1003 00:07:18.711078 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f189ec1-8169-4886-bff6-f67f7958050e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ba40dbb3c6b603f4c75b3f017b0b8953a189aac6dba4cc0e1fa5174b6353f3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2c63d70335d75dec7e19ae160dfa2a00f834d9aa3725bb4b63b94aa488fca0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3b296d00954bf341fee4ad258c6ee44adbd2950c89e73aaf5141201b61c5332\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\
\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7712a571bceb2c88adb90f9f18cdef6f7666e96341c1f9c87e6a55c82b59efd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6ee507d1127a1ca82e96f57a031cfef9adbd0b310588ce49a32f0cbcab7b3f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e493b0aab45e7a7842b48f0a13a6850b3bcd8a8f2147880ff278b92b04e9535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\
"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1961487ddf2d62133f833782a0d3d51915a06b7f33a8b93748baea1713f0edb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recurs
iveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae1e89f85d166170fd478f58e653b46eaa1c8335c1b124af9fde5b805864197b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0669e6b46f57beece4c83940bcffe580a4b4e2db5451a9c9ca4a9ad486fc95b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0669e6b46f57beece4c83940bcffe580a4b4e2db5451a9c9ca4a9ad486fc95b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zc9xl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:18Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:18 crc kubenswrapper[4832]: I1003 00:07:18.729772 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78a6b04d90ea1fd4356f0a54594a83fd6029be216c6263654ad4916f2c3b7f5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:18Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:18 crc kubenswrapper[4832]: I1003 00:07:18.744672 4832 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" Oct 03 00:07:18 crc kubenswrapper[4832]: I1003 00:07:18.750118 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fad7ffe6-f721-4c38-8acd-2d4de59b527b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da0287ba03b221e49a8840f93c53df8a865b9216a4e93f4a0fc4c62d79227004\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edaa07dca4d8aba0a6439a43af7397b5d9ef4d5a398f5c03ac93f3f41fb80ed9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef77e4c71ea4d3dd7aa61afb0787bf88352730ae638cfefe1f9f9f033208692a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0636ac12ebbda7e4ee8bbe4f5be5454ca7a879c92c8f444466a34c387668398c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af8925d64d9d95117625cfca58e2f0c90128ce63c82370e3a505d48fce14e474\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"message\\\":\\\"file observer\\\\nW1003 00:07:06.003008 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 00:07:06.003232 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 00:07:06.007835 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-713505854/tls.crt::/tmp/serving-cert-713505854/tls.key\\\\\\\"\\\\nI1003 00:07:06.247728 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 00:07:06.253351 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 00:07:06.253377 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 00:07:06.253665 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 00:07:06.253677 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 00:07:06.261087 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 00:07:06.261596 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 00:07:06.261602 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 00:07:06.261606 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 00:07:06.261610 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 00:07:06.261613 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 00:07:06.261616 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 00:07:06.261905 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 00:07:06.264368 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc6fca5c4906594dc721c77dba62a306d840e88ed949a9a579a695bdc1f1c011\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://359b3f91c772c18a52ed1e5985275405b990187a860cf7a7f62c2f602f54c77e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://359b3f91c772c18a52ed1e5985275405b990187a860cf7a7f62c2f602f54c77e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:06:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:18Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:18 crc kubenswrapper[4832]: I1003 00:07:18.768372 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a530701-412d-4b02-b532-de7993881992\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2b0553c8f774a096bc1446325436a1a73a24f67b3512462ce0b5029092f30c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0451961c80d1affd1cc6903d8c4ccba7de47c25255826d49ee312958f7a73b79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09e57b2ff533ebe739ef6e23dcd8d6f5ef85143046db77ed3a95ffb87f771152\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://703d4dc13279deda56e8154ab7238b507e78852fd477db2bbe6e3e19aa0fbd37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:06:47Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:18Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:18 crc kubenswrapper[4832]: I1003 00:07:18.783828 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:18Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:18 crc kubenswrapper[4832]: I1003 00:07:18.784678 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:18 crc kubenswrapper[4832]: I1003 00:07:18.784713 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:18 crc kubenswrapper[4832]: I1003 00:07:18.784724 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:18 crc kubenswrapper[4832]: I1003 00:07:18.784739 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:18 crc kubenswrapper[4832]: I1003 00:07:18.784763 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:18Z","lastTransitionTime":"2025-10-03T00:07:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:18 crc kubenswrapper[4832]: I1003 00:07:18.796557 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13eeb165-19f7-4e3f-a442-1c9c570d853b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://196d3468b2a4ea6792c19f85b28df1b5df5e612428d986d2affed70158ba92f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfnk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://786805325c320e5c08542c14a9ac9790f7c2dec237737fe1aca1523d1f2d26a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfnk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xzpk4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:18Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:18 crc kubenswrapper[4832]: I1003 00:07:18.810694 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:18Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:18 crc kubenswrapper[4832]: I1003 00:07:18.829259 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4175afbbdf7905f2c69f608e1cd85fdfae2818582d88e9a6d46c20ba0cede788\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:18Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:18 crc kubenswrapper[4832]: I1003 00:07:18.850624 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6753f7eb0f3ff7e4c94125513be77026242079d658b7139e86701a2b04ba5d1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://823db4c8596fa85759a3d818b0282b8b8b2c4c2e948160accbd306ef09e38060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:18Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:18 crc kubenswrapper[4832]: I1003 00:07:18.863433 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rwqt7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96bf1ee1-9249-4db9-83da-17eb15aa5f9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://778c9afa5747e424ff72688b9d90dd21e3faec1f409c2b9683b5a16a37685369\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm92t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rwqt7\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:18Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:18 crc kubenswrapper[4832]: I1003 00:07:18.874357 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-76ck7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b962b7f-e442-4e25-97f4-21f16b9f1324\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e5df7a035e3c381c541f08eec6fffe846e7002a534b4c6ae1316a15cf74e469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ksm2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-76ck7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:18Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:18 crc kubenswrapper[4832]: I1003 00:07:18.889864 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:18 crc kubenswrapper[4832]: I1003 00:07:18.889899 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:18 crc kubenswrapper[4832]: I1003 00:07:18.889910 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:18 crc 
kubenswrapper[4832]: I1003 00:07:18.889928 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:18 crc kubenswrapper[4832]: I1003 00:07:18.890011 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:18Z","lastTransitionTime":"2025-10-03T00:07:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:18 crc kubenswrapper[4832]: I1003 00:07:18.898445 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31c3e0e9-c2a7-4c60-b523-9972df4adad5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1688f7425bdf5630147752e6d2c0dc58351b86d087c39bc6c4d73ffbf11949b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70f4161badca9a8087ab95b77ede5bc72d0c44b3f7f0223f2073354e76d6070d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containe
rID\\\":\\\"cri-o://5eb1fa15d84200378239e8a064cb83aa401750e92073f9d29d94858e65add776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://342b151ec083dc566428af857e4013b70d34fbf70632147d13f5e251ef3460c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f39a6c9881b2f1d620a85a3c6887d82b81d18ecf599c66b967d5580689537314\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a180babeb6469b2158474f74047864c74117a5c272be9437712bc1729563108b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a180babeb6469b2158474f74047864c74117a5c272be9437712bc1729563108b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b12fcbe54af7073178320f8a130e78196b2d1
d3a055c4d8410e4cc1da2d4686e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b12fcbe54af7073178320f8a130e78196b2d1d3a055c4d8410e4cc1da2d4686e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f5e7853d0c8e96873c687c75d7226d8c11ed0ab5f4f2e1e975423de06b7c8c10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5e7853d0c8e96873c687c75d7226d8c11ed0ab5f4f2e1e975423de06b7c8c10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:06:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:18Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:18 crc kubenswrapper[4832]: I1003 00:07:18.910786 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:18Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:18 crc kubenswrapper[4832]: I1003 00:07:18.926346 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zdfsz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66a6f849-6870-4e28-8ca4-c849c22ad252\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f49f638397d6dffb41b5774d362b7441d3191f290149de6ecef5c38a6e8a7a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f49f638397d6dffb41b5774d362b7441d3191f290149de6ecef5c38a6e8a7a81\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d52a70ad1026d8eaaeed9dbb581eb90dbc962f036711e2f68085ac437c46b1fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d52a70ad1026d8eaaeed9dbb581eb90dbc962f036711e2f68085ac437c46b1fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ddd5a446bcc39d02bbaf8ae1b9a92b550b975c4280cea13a517076261e177bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ddd5a446bcc39d02bbaf8ae1b9a92b550b975c4280cea13a517076261e177bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f907c9873e2f8cbedebcc3116624dbaf2e6b8088d9f0ef52a5878e1515e73720\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f907c9873e2f8cbedebcc3116624dbaf2e6b8088d9f0ef52a5878e1515e73720\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d991977ea5d7140ddc09fecfc872067a05a7773c0066c94ec88c603b6f79f32e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d991977ea5d7140ddc09fecfc872067a05a7773c0066c94ec88c603b6f79f32e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7711e37ec666f0285f64eac11397ea7747d74997deaba1e9cc141218d28fbaa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7711e37ec666f0285f64eac11397ea7747d74997deaba1e9cc141218d28fbaa7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zdfsz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:18Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:18 crc kubenswrapper[4832]: I1003 00:07:18.945696 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f189ec1-8169-4886-bff6-f67f7958050e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ba40dbb3c6b603f4c75b3f017b0b8953a189aac6dba4cc0e1fa5174b6353f3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2c63d70335d75dec7e19ae160dfa2a00f834d9aa3725bb4b63b94aa488fca0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3b296d00954bf341fee4ad258c6ee44adbd2950c89e73aaf5141201b61c5332\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7712a571bceb2c88adb90f9f18cdef6f7666e96341c1f9c87e6a55c82b59efd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6ee507d1127a1ca82e96f57a031cfef9adbd0b310588ce49a32f0cbcab7b3f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e493b0aab45e7a7842b48f0a13a6850b3bcd8a8f2147880ff278b92b04e9535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1961487ddf2d62133f833782a0d3d51915a06b7f
33a8b93748baea1713f0edb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae1e89f85d166170fd478f58e653b46eaa1c8335c1b124af9fde5b805864197b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccoun
t\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0669e6b46f57beece4c83940bcffe580a4b4e2db5451a9c9ca4a9ad486fc95b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0669e6b46f57beece4c83940bcffe580a4b4e2db5451a9c9ca4a9ad486fc95b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zc9xl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:18Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:18 crc kubenswrapper[4832]: I1003 00:07:18.961212 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78a6b04d90ea1fd4356f0a54594a83fd6029be216c6263654ad4916f2c3b7f5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:18Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:18 crc kubenswrapper[4832]: I1003 00:07:18.971006 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-b9c9g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4369beec-ad3f-4de9-909b-7d0f8c76aad3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://898c5e76b7f0bbad018b2aa9db1e27240ccbd9d5b66854f548aa6d664abc8799\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvpm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-b9c9g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:18Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:18 crc kubenswrapper[4832]: I1003 00:07:18.984120 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a530701-412d-4b02-b532-de7993881992\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2b0553c8f774a096bc1446325436a1a73a24f67b3512462ce0b5029092f30c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0451961c80d1affd1cc6903d8c4ccba7de47c25255826d49ee312958f7a73b79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09e57b2ff533ebe739ef6e23dcd8d6f5ef85143046db77ed3a95ffb87f771152\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://703d4dc13279deda56e8154ab7238b507e78852fd477db2bbe6e3e19aa0fbd37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:06:47Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:18Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:18 crc kubenswrapper[4832]: I1003 00:07:18.991842 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:18 crc kubenswrapper[4832]: I1003 00:07:18.992113 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:18 crc kubenswrapper[4832]: I1003 00:07:18.992219 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:18 crc kubenswrapper[4832]: I1003 00:07:18.992356 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:18 crc kubenswrapper[4832]: I1003 00:07:18.992510 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:18Z","lastTransitionTime":"2025-10-03T00:07:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:18 crc kubenswrapper[4832]: I1003 00:07:18.997860 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:18Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:19 crc kubenswrapper[4832]: I1003 00:07:19.015228 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"13eeb165-19f7-4e3f-a442-1c9c570d853b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://196d3468b2a4ea6792c19f85b28df1b5df5e612428d986d2affed70158ba92f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfnk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://786805325c320e5c08542c14a9ac9790f7c2dec237737fe1aca1523d1f2d26a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfnk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xzpk4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:19Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:19 crc kubenswrapper[4832]: I1003 00:07:19.031606 4832 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fad7ffe6-f721-4c38-8acd-2d4de59b527b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da0287ba03b221e49a8840f93c53df8a865b9216a4e93f4a0fc4c62d79227004\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edaa07dca4d8aba0a6439a43af7397b5d9ef4d5a398f5c03ac93f3f41fb80ed9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef77e4c71ea4d3dd7aa61afb0787bf88352730ae638cfefe1f9f9f033208692a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\
"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0636ac12ebbda7e4ee8bbe4f5be5454ca7a879c92c8f444466a34c387668398c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af8925d64d9d95117625cfca58e2f0c90128ce63c82370e3a505d48fce14e474\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"message\\\":\\\"file observer\\\\nW1003 00:07:06.003008 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 00:07:06.003232 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 00:07:06.007835 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-713505854/tls.crt::/tmp/serving-cert-713505854/tls.key\\\\\\\"\\\\nI1003 00:07:06.247728 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 00:07:06.253351 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 00:07:06.253377 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 00:07:06.253665 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 00:07:06.253677 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 00:07:06.261087 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 00:07:06.261596 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 00:07:06.261602 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 00:07:06.261606 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 00:07:06.261610 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 00:07:06.261613 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 00:07:06.261616 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 00:07:06.261905 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 00:07:06.264368 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc6fca5c4906594dc721c77dba62a306d840e88ed949a9a579a695bdc1f1c011\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://359b3f91c772c18a52ed1e5985275405b990187a860cf7a7f62c2f602f54c77e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://359b3f91c772c18a52ed1e5985275405b990187a860cf7a7f62c2f602f54c77e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:06:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:19Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:19 crc kubenswrapper[4832]: I1003 00:07:19.044792 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4175afbbdf7905f2c69f608e1cd85fdfae2818582d88e9a6d46c20ba0cede788\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:19Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:19 crc kubenswrapper[4832]: I1003 00:07:19.057619 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6753f7eb0f3ff7e4c94125513be77026242079d658b7139e86701a2b04ba5d1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://823db4c8596fa85759a3d818b0282b8b8b2c4c2e948160accbd306ef09e38060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:19Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:19 crc kubenswrapper[4832]: I1003 00:07:19.072516 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rwqt7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96bf1ee1-9249-4db9-83da-17eb15aa5f9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://778c9afa5747e424ff72688b9d90dd21e3faec1f409c2b9683b5a16a37685369\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm92t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rwqt7\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:19Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:19 crc kubenswrapper[4832]: I1003 00:07:19.084232 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-76ck7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b962b7f-e442-4e25-97f4-21f16b9f1324\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e5df7a035e3c381c541f08eec6fffe846e7002a534b4c6ae1316a15cf74e469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ksm2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-76ck7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:19Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:19 crc kubenswrapper[4832]: I1003 00:07:19.095547 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:19 crc kubenswrapper[4832]: I1003 00:07:19.095632 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:19 crc kubenswrapper[4832]: I1003 00:07:19.095647 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:19 crc 
kubenswrapper[4832]: I1003 00:07:19.095671 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:19 crc kubenswrapper[4832]: I1003 00:07:19.095686 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:19Z","lastTransitionTime":"2025-10-03T00:07:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:19 crc kubenswrapper[4832]: I1003 00:07:19.111018 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31c3e0e9-c2a7-4c60-b523-9972df4adad5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1688f7425bdf5630147752e6d2c0dc58351b86d087c39bc6c4d73ffbf11949b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70f4161badca9a8087ab95b77ede5bc72d0c44b3f7f0223f2073354e76d6070d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containe
rID\\\":\\\"cri-o://5eb1fa15d84200378239e8a064cb83aa401750e92073f9d29d94858e65add776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://342b151ec083dc566428af857e4013b70d34fbf70632147d13f5e251ef3460c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f39a6c9881b2f1d620a85a3c6887d82b81d18ecf599c66b967d5580689537314\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a180babeb6469b2158474f74047864c74117a5c272be9437712bc1729563108b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a180babeb6469b2158474f74047864c74117a5c272be9437712bc1729563108b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b12fcbe54af7073178320f8a130e78196b2d1
d3a055c4d8410e4cc1da2d4686e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b12fcbe54af7073178320f8a130e78196b2d1d3a055c4d8410e4cc1da2d4686e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f5e7853d0c8e96873c687c75d7226d8c11ed0ab5f4f2e1e975423de06b7c8c10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5e7853d0c8e96873c687c75d7226d8c11ed0ab5f4f2e1e975423de06b7c8c10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:06:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:19Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:19 crc kubenswrapper[4832]: I1003 00:07:19.124365 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:19Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:19 crc kubenswrapper[4832]: I1003 00:07:19.198079 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:19 crc kubenswrapper[4832]: I1003 00:07:19.198400 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:19 crc kubenswrapper[4832]: I1003 00:07:19.198467 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:19 crc kubenswrapper[4832]: I1003 00:07:19.198533 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:19 crc kubenswrapper[4832]: I1003 00:07:19.198594 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:19Z","lastTransitionTime":"2025-10-03T00:07:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:19 crc kubenswrapper[4832]: I1003 00:07:19.301142 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:19 crc kubenswrapper[4832]: I1003 00:07:19.301182 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:19 crc kubenswrapper[4832]: I1003 00:07:19.301195 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:19 crc kubenswrapper[4832]: I1003 00:07:19.301211 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:19 crc kubenswrapper[4832]: I1003 00:07:19.301226 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:19Z","lastTransitionTime":"2025-10-03T00:07:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:19 crc kubenswrapper[4832]: I1003 00:07:19.386044 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 00:07:19 crc kubenswrapper[4832]: E1003 00:07:19.386178 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 00:07:19 crc kubenswrapper[4832]: I1003 00:07:19.386238 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 00:07:19 crc kubenswrapper[4832]: E1003 00:07:19.386308 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 00:07:19 crc kubenswrapper[4832]: I1003 00:07:19.386044 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 00:07:19 crc kubenswrapper[4832]: E1003 00:07:19.386362 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 00:07:19 crc kubenswrapper[4832]: I1003 00:07:19.404103 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:19 crc kubenswrapper[4832]: I1003 00:07:19.404147 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:19 crc kubenswrapper[4832]: I1003 00:07:19.404166 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:19 crc kubenswrapper[4832]: I1003 00:07:19.404193 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:19 crc kubenswrapper[4832]: I1003 00:07:19.404211 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:19Z","lastTransitionTime":"2025-10-03T00:07:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:19 crc kubenswrapper[4832]: I1003 00:07:19.507176 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:19 crc kubenswrapper[4832]: I1003 00:07:19.507250 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:19 crc kubenswrapper[4832]: I1003 00:07:19.507272 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:19 crc kubenswrapper[4832]: I1003 00:07:19.507304 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:19 crc kubenswrapper[4832]: I1003 00:07:19.507330 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:19Z","lastTransitionTime":"2025-10-03T00:07:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:19 crc kubenswrapper[4832]: I1003 00:07:19.611001 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:19 crc kubenswrapper[4832]: I1003 00:07:19.611079 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:19 crc kubenswrapper[4832]: I1003 00:07:19.611102 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:19 crc kubenswrapper[4832]: I1003 00:07:19.611132 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:19 crc kubenswrapper[4832]: I1003 00:07:19.611160 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:19Z","lastTransitionTime":"2025-10-03T00:07:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:19 crc kubenswrapper[4832]: I1003 00:07:19.638790 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zdfsz" event={"ID":"66a6f849-6870-4e28-8ca4-c849c22ad252","Type":"ContainerStarted","Data":"53e3e718893a18a23dee11b28bb7285f26501527c6aa136acf2b8f7e0d7494c3"} Oct 03 00:07:19 crc kubenswrapper[4832]: I1003 00:07:19.639566 4832 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" Oct 03 00:07:19 crc kubenswrapper[4832]: I1003 00:07:19.661803 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78a6b04d90ea1fd4356f0a54594a83fd6029be216c6263654ad4916f2c3b7f5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:19Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:19 crc kubenswrapper[4832]: I1003 00:07:19.670671 4832 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" Oct 03 00:07:19 crc kubenswrapper[4832]: I1003 00:07:19.682192 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-b9c9g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4369beec-ad3f-4de9-909b-7d0f8c76aad3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://898c5e76b7f0bbad018b2aa9db1e27240ccbd9d5b66854f548aa6d664abc8799\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvpm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-b9c9g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:19Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:19 crc kubenswrapper[4832]: I1003 00:07:19.711630 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f189ec1-8169-4886-bff6-f67f7958050e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ba40dbb3c6b603f4c75b3f017b0b8953a189aac6dba4cc0e1fa5174b6353f3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2c63d70335d75dec7e19ae160dfa2a00f834d9aa3725bb4b63b94aa488fca0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3b296d00954bf341fee4ad258c6ee44adbd2950c89e73aaf5141201b61c5332\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\
"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7712a571bceb2c88adb90f9f18cdef6f7666e96341c1f9c87e6a55c82b59efd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6ee507d1127a1ca82e96f57a031cfef9adbd0b310588ce49a32f0cbcab7b3f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e493b0aab45e7a7842b48f0a13a6850b3bcd8a8f2147880ff278b92b04e9535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-sock
et\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1961487ddf2d62133f833782a0d3d51915a06b7f33a8b93748baea1713f0edb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae1e89f85d166170fd478f58e653b46eaa1c8335c1b124af9fde5b805864197b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0669e6b46f57beece4c83940bcffe580a4b4e2db5451a9c9ca4a9ad486fc95b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0669e6b46f57beece4c83940bcffe580a4b4e2db5451a9c9ca4a9ad486fc95b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zc9xl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:19Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:19 crc kubenswrapper[4832]: I1003 00:07:19.714185 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:19 crc kubenswrapper[4832]: I1003 00:07:19.714220 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:19 crc kubenswrapper[4832]: I1003 00:07:19.714233 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:19 crc kubenswrapper[4832]: I1003 00:07:19.714251 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:19 crc kubenswrapper[4832]: I1003 00:07:19.714279 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:19Z","lastTransitionTime":"2025-10-03T00:07:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
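Note on the repeated NodeNotReady condition: the kubelet keeps reporting NetworkReady=false with reason NetworkPluginNotReady because no CNI configuration file is present in /etc/kubernetes/cni/net.d/. A rough Go sketch of that directory check follows, assuming the conventional .conf, .conflist and .json extensions; the runtime's exact matching rules are not shown in this log, so treat the extension list as an assumption.

package main

import (
	"fmt"
	"os"
	"path/filepath"
)

func main() {
	// Directory named in the kubelet message above.
	dir := "/etc/kubernetes/cni/net.d"
	entries, err := os.ReadDir(dir)
	if err != nil {
		fmt.Println("read dir:", err)
		return
	}
	found := 0
	for _, e := range entries {
		if e.IsDir() {
			continue
		}
		switch filepath.Ext(e.Name()) {
		case ".conf", ".conflist", ".json": // assumed extensions; loaders differ in detail
			fmt.Println("CNI config present:", e.Name())
			found++
		}
	}
	if found == 0 {
		// Matches the state reported above: NetworkReady=false, NetworkPluginNotReady.
		fmt.Println("no CNI configuration file found; network plugin not ready")
	}
}
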
Has your network provider started?"} Oct 03 00:07:19 crc kubenswrapper[4832]: I1003 00:07:19.729501 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:19Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:19 crc kubenswrapper[4832]: I1003 00:07:19.744428 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"13eeb165-19f7-4e3f-a442-1c9c570d853b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://196d3468b2a4ea6792c19f85b28df1b5df5e612428d986d2affed70158ba92f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfnk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://786805325c320e5c08542c14a9ac9790f7c2dec237737fe1aca1523d1f2d26a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfnk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xzpk4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:19Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:19 crc kubenswrapper[4832]: I1003 00:07:19.759132 4832 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fad7ffe6-f721-4c38-8acd-2d4de59b527b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da0287ba03b221e49a8840f93c53df8a865b9216a4e93f4a0fc4c62d79227004\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edaa07dca4d8aba0a6439a43af7397b5d9ef4d5a398f5c03ac93f3f41fb80ed9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef77e4c71ea4d3dd7aa61afb0787bf88352730ae638cfefe1f9f9f033208692a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\
"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0636ac12ebbda7e4ee8bbe4f5be5454ca7a879c92c8f444466a34c387668398c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af8925d64d9d95117625cfca58e2f0c90128ce63c82370e3a505d48fce14e474\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"message\\\":\\\"file observer\\\\nW1003 00:07:06.003008 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 00:07:06.003232 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 00:07:06.007835 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-713505854/tls.crt::/tmp/serving-cert-713505854/tls.key\\\\\\\"\\\\nI1003 00:07:06.247728 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 00:07:06.253351 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 00:07:06.253377 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 00:07:06.253665 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 00:07:06.253677 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 00:07:06.261087 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 00:07:06.261596 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 00:07:06.261602 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 00:07:06.261606 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 00:07:06.261610 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 00:07:06.261613 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 00:07:06.261616 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 00:07:06.261905 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 00:07:06.264368 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc6fca5c4906594dc721c77dba62a306d840e88ed949a9a579a695bdc1f1c011\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://359b3f91c772c18a52ed1e5985275405b990187a860cf7a7f62c2f602f54c77e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://359b3f91c772c18a52ed1e5985275405b990187a860cf7a7f62c2f602f54c77e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:06:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:19Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:19 crc kubenswrapper[4832]: I1003 00:07:19.776442 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a530701-412d-4b02-b532-de7993881992\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2b0553c8f774a096bc1446325436a1a73a24f67b3512462ce0b5029092f30c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0451961c80d1affd1cc6903d8c4ccba7de47c25255826d49ee312958f7a73b79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09e57b2ff533ebe739ef6e23dcd8d6f5ef85143046db77ed3a95ffb87f771152\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://703d4dc13279deda56e8154ab7238b507e78852fd477db2bbe6e3e19aa0fbd37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:06:47Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:19Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:19 crc kubenswrapper[4832]: I1003 00:07:19.791044 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rwqt7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96bf1ee1-9249-4db9-83da-17eb15aa5f9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://778c9afa5747e424ff72688b9d90dd21e3faec1f409c2b9683b5a16a37685369\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run
/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm92t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rwqt7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:19Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:19 crc kubenswrapper[4832]: I1003 00:07:19.801745 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-76ck7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b962b7f-e442-4e25-97f4-21f16b9f1324\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e5df7a035e3c381c541f08eec6fffe846e7002a534b4c6ae1316a15cf74e469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ksm2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-76ck7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:19Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:19 crc kubenswrapper[4832]: I1003 00:07:19.816315 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:19 crc kubenswrapper[4832]: I1003 00:07:19.816342 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:19 crc kubenswrapper[4832]: I1003 00:07:19.816352 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:19 crc kubenswrapper[4832]: I1003 00:07:19.816365 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:19 crc kubenswrapper[4832]: I1003 00:07:19.816374 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:19Z","lastTransitionTime":"2025-10-03T00:07:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:19 crc kubenswrapper[4832]: I1003 00:07:19.821141 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31c3e0e9-c2a7-4c60-b523-9972df4adad5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1688f7425bdf5630147752e6d2c0dc58351b86d087c39bc6c4d73ffbf11949b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70f4161badca9a8087ab95b77ede5bc72d0c44b3f7f0223f2073354e76d6070d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5eb1fa15d84200378239e8a064cb83aa401750e92073f9d29d94858e65add776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":
0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://342b151ec083dc566428af857e4013b70d34fbf70632147d13f5e251ef3460c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f39a6c9881b2f1d620a85a3c6887d82b81d18ecf599c66b967d5580689537314\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a180babeb6469b2158474f74047864c74117a5c272be9437712bc1729563108b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a180babeb6469b2158474f74047864c74117a5c272be9437712bc1729563108b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b12fcbe54af7073178320f8a130e78196b2d1d3a055c4d8410e4cc1da2d4686e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"termi
nated\\\":{\\\"containerID\\\":\\\"cri-o://b12fcbe54af7073178320f8a130e78196b2d1d3a055c4d8410e4cc1da2d4686e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f5e7853d0c8e96873c687c75d7226d8c11ed0ab5f4f2e1e975423de06b7c8c10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5e7853d0c8e96873c687c75d7226d8c11ed0ab5f4f2e1e975423de06b7c8c10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:06:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:19Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:19 crc kubenswrapper[4832]: I1003 00:07:19.838541 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:19Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:19 crc kubenswrapper[4832]: I1003 00:07:19.858566 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4175afbbdf7905f2c69f608e1cd85fdfae2818582d88e9a6d46c20ba0cede788\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:19Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:19 crc kubenswrapper[4832]: I1003 00:07:19.871818 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6753f7eb0f3ff7e4c94125513be77026242079d658b7139e86701a2b04ba5d1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://823db4c8596fa85759a3d818b0282b8b8b2c4c2e948160accbd306ef09e38060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:19Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:19 crc kubenswrapper[4832]: I1003 00:07:19.885003 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:19Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:19 crc kubenswrapper[4832]: I1003 00:07:19.899315 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zdfsz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"66a6f849-6870-4e28-8ca4-c849c22ad252\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53e3e718893a18a23dee11b28bb7285f26501527c6aa136acf2b8f7e0d7494c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f49f638397d6dffb41b5774d362b7441d3191f290149de6ecef5c38a6e8a7a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f49f638397d6dffb41b5774d362b7441d3191f290149de6ecef5c38a6e8a7a81\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d52a70ad1026d8eaaeed9dbb581eb90dbc962f036711e2f68085ac437c46b1fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d52a70ad1026d8eaaeed9dbb581eb90dbc962f036711e2f68085ac437c46b1fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ddd5a446bcc39d02bbaf8ae1b9a92b550b975c4280cea13a517076261e177bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ddd5a446bcc39d02bbaf8ae1b9a92b550b975c4280cea13a517076261e177bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f907c9873e2f8cbedebcc3116624dbaf2e6b8088d9f0ef52a5878e1515e73720\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f907c9873e2f8cbedebcc3116624dbaf2e6b8088d9f0ef52a5878e1515e73720\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d991977ea5d7140ddc09fecfc872067a05a7773c0066c94ec88c603b6f79f32e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d991977ea5d7140ddc09fecfc872067a05a7773c0066c94ec88c603b6f79f32e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7711e37ec666f0285f64eac11397ea7747d74997deaba1e9cc141218d28fbaa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7711e37ec666f0285f64eac11397ea7747d74997deaba1e9cc141218d28fbaa7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zdfsz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:19Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:19 crc kubenswrapper[4832]: I1003 00:07:19.918064 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:19 crc kubenswrapper[4832]: I1003 00:07:19.918088 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:19 crc 
kubenswrapper[4832]: I1003 00:07:19.918097 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:19 crc kubenswrapper[4832]: I1003 00:07:19.918109 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:19 crc kubenswrapper[4832]: I1003 00:07:19.918117 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:19Z","lastTransitionTime":"2025-10-03T00:07:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:19 crc kubenswrapper[4832]: I1003 00:07:19.919786 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31c3e0e9-c2a7-4c60-b523-9972df4adad5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1688f7425bdf5630147752e6d2c0dc58351b86d087c39bc6c4d73ffbf11949b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70f4161badca9a8087ab95b77ede5bc72d0c44b3f7f0223f2073354e76d6070d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mou
ntPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5eb1fa15d84200378239e8a064cb83aa401750e92073f9d29d94858e65add776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://342b151ec083dc566428af857e4013b70d34fbf70632147d13f5e251ef3460c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f39a6c9881b2f1d620a85a3c6887d82b81d18ecf599c66b967d5580689537314\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a180babeb6469b2158474f74047864c74117a5c272be9437712bc1729563108b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a180babeb6469b2158474f74047864c74117a5c272be9437712bc1729563108b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b12fcbe54af7073178320f8a130e78196b2d1d3a055c4d8410e4cc1da2d4686e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b12fcbe54af7073178320f8a130e78196b2d1d3a055c4d8410e4cc1da2d4686e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f5e7853d0c8e96873c687c75d7226d8c11ed0ab5f4f2e1e975423de06b7c8c10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5e7853d0c8e96873c687c75d7226d8c11ed0ab5f4f2e1e975423de06b7c8c10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:06:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:19Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:19 crc kubenswrapper[4832]: I1003 00:07:19.932691 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:19Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:19 crc kubenswrapper[4832]: I1003 00:07:19.945652 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4175afbbdf7905f2c69f608e1cd85fdfae2818582d88e9a6d46c20ba0cede788\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:19Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:19 crc kubenswrapper[4832]: I1003 00:07:19.957411 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6753f7eb0f3ff7e4c94125513be77026242079d658b7139e86701a2b04ba5d1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://823db4c8596fa85759a3d818b0282b8b8b2c4c2e948160accbd306ef09e38060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:19Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:19 crc kubenswrapper[4832]: I1003 
00:07:19.972106 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rwqt7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96bf1ee1-9249-4db9-83da-17eb15aa5f9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://778c9afa5747e424ff72688b9d90dd21e3faec1f409c2b9683b5a16a37685369\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm92t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2
025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rwqt7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:19Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:19 crc kubenswrapper[4832]: I1003 00:07:19.983398 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-76ck7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b962b7f-e442-4e25-97f4-21f16b9f1324\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e5df7a035e3c381c541f08eec6fffe846e7002a534b4c6ae1316a15cf74e469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ksm2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-76ck7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:19Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:19 crc kubenswrapper[4832]: I1003 00:07:19.995446 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:19Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:20 crc kubenswrapper[4832]: I1003 00:07:20.009121 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zdfsz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"66a6f849-6870-4e28-8ca4-c849c22ad252\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53e3e718893a18a23dee11b28bb7285f26501527c6aa136acf2b8f7e0d7494c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f49f638397d6dffb41b5774d362b7441d3191f290149de6ecef5c38a6e8a7a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f49f638397d6dffb41b5774d362b7441d3191f290149de6ecef5c38a6e8a7a81\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d52a70ad1026d8eaaeed9dbb581eb90dbc962f036711e2f68085ac437c46b1fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d52a70ad1026d8eaaeed9dbb581eb90dbc962f036711e2f68085ac437c46b1fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ddd5a446bcc39d02bbaf8ae1b9a92b550b975c4280cea13a517076261e177bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ddd5a446bcc39d02bbaf8ae1b9a92b550b975c4280cea13a517076261e177bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f907c9873e2f8cbedebcc3116624dbaf2e6b8088d9f0ef52a5878e1515e73720\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f907c9873e2f8cbedebcc3116624dbaf2e6b8088d9f0ef52a5878e1515e73720\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d991977ea5d7140ddc09fecfc872067a05a7773c0066c94ec88c603b6f79f32e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d991977ea5d7140ddc09fecfc872067a05a7773c0066c94ec88c603b6f79f32e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7711e37ec666f0285f64eac11397ea7747d74997deaba1e9cc141218d28fbaa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7711e37ec666f0285f64eac11397ea7747d74997deaba1e9cc141218d28fbaa7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zdfsz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:20Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:20 crc kubenswrapper[4832]: I1003 00:07:20.022333 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78a6b04d90ea1fd4356f0a54594a83fd6029be216c6263654ad4916f2c3b7f5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:20Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:20 crc kubenswrapper[4832]: I1003 00:07:20.023610 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:20 crc kubenswrapper[4832]: I1003 00:07:20.023632 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:20 crc kubenswrapper[4832]: I1003 00:07:20.023642 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:20 crc kubenswrapper[4832]: I1003 00:07:20.023655 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:20 crc kubenswrapper[4832]: I1003 00:07:20.023663 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:20Z","lastTransitionTime":"2025-10-03T00:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:20 crc kubenswrapper[4832]: I1003 00:07:20.033650 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-b9c9g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4369beec-ad3f-4de9-909b-7d0f8c76aad3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://898c5e76b7f0bbad018b2aa9db1e27240ccbd9d5b66854f548aa6d664abc8799\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvpm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-b9c9g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:20Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:20 crc kubenswrapper[4832]: I1003 00:07:20.053112 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f189ec1-8169-4886-bff6-f67f7958050e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ba40dbb3c6b603f4c75b3f017b0b8953a189aac6dba4cc0e1fa5174b6353f3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2c63d70335d75dec7e19ae160dfa2a00f834d9aa3725bb4b63b94aa488fca0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3b296d00954bf341fee4ad258c6ee44adbd2950c89e73aaf5141201b61c5332\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7712a571bceb2c88adb90f9f18cdef6f7666e96341c1f9c87e6a55c82b59efd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6ee507d1127a1ca82e96f57a031cfef9adbd0b310588ce49a32f0cbcab7b3f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e493b0aab45e7a7842b48f0a13a6850b3bcd8a8f2147880ff278b92b04e9535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1961487ddf2d62133f833782a0d3d51915a06b7f33a8b93748baea1713f0edb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"D
isabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae1e89f85d166170fd478f58e653b46eaa1c8335c1b124af9fde5b805864197b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0669e6b46f57beece4c83940bcffe580a4b4e2db5451a9c9ca4a9ad486fc95b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0669e6b46f57beece4c83940bcffe580a4b4e2db5451a9c9ca4a9ad486fc95b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zc9xl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:20Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:20 crc kubenswrapper[4832]: I1003 00:07:20.071146 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fad7ffe6-f721-4c38-8acd-2d4de59b527b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da0287ba03b221e49a8840f93c53df8a865b9216a4e93f4a0fc4c62d79227004\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edaa07dca4d8aba0a6439a43af7397b5d9ef4d5a398f5c03ac93f3f41fb80ed9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef77e4c71ea4d3dd7aa61afb0787bf88352730ae638cfefe1f9f9f033208692a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0636ac12ebbda7e4ee8bbe4f5be5454ca7a879c92c8f444466a34c387668398c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af8925d64d9d95117625cfca58e2f0c90128ce63c82370e3a505d48fce14e474\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"message\\\":\\\"file observer\\\\nW1003 00:07:06.003008 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 00:07:06.003232 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 00:07:06.007835 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-713505854/tls.crt::/tmp/serving-cert-713505854/tls.key\\\\\\\"\\\\nI1003 00:07:06.247728 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 00:07:06.253351 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 00:07:06.253377 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 00:07:06.253665 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 00:07:06.253677 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 00:07:06.261087 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 00:07:06.261596 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 00:07:06.261602 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 00:07:06.261606 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 00:07:06.261610 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 00:07:06.261613 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 00:07:06.261616 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 00:07:06.261905 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 00:07:06.264368 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc6fca5c4906594dc721c77dba62a306d840e88ed949a9a579a695bdc1f1c011\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://359b3f91c772c18a52ed1e5985275405b990187a860cf7a7f62c2f602f54c77e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://359b3f91c772c18a52ed1e5985275405b990187a860cf7a7f62c2f602f54c77e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:06:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:20Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:20 crc kubenswrapper[4832]: I1003 00:07:20.084618 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a530701-412d-4b02-b532-de7993881992\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2b0553c8f774a096bc1446325436a1a73a24f67b3512462ce0b5029092f30c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0451961c80d1affd1cc6903d8c4ccba7de47c25255826d49ee312958f7a73b79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09e57b2ff533ebe739ef6e23dcd8d6f5ef85143046db77ed3a95ffb87f771152\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://703d4dc13279deda56e8154ab7238b507e78852fd477db2bbe6e3e19aa0fbd37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:06:47Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:20Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:20 crc kubenswrapper[4832]: I1003 00:07:20.098876 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:20Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:20 crc kubenswrapper[4832]: I1003 00:07:20.109916 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13eeb165-19f7-4e3f-a442-1c9c570d853b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://196d3468b2a4ea6792c19f85b28df1b5df5e612428d986d2affed70158ba92f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfnk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://786805325c320e5c08542c14a9ac9790f7c2dec237737fe1aca1523d1f2d26a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfnk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xzpk4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:20Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:20 crc kubenswrapper[4832]: I1003 00:07:20.125517 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:20 crc kubenswrapper[4832]: I1003 00:07:20.125546 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:20 crc kubenswrapper[4832]: I1003 00:07:20.125554 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:20 crc kubenswrapper[4832]: I1003 00:07:20.125568 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:20 crc kubenswrapper[4832]: I1003 00:07:20.125578 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:20Z","lastTransitionTime":"2025-10-03T00:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:20 crc kubenswrapper[4832]: I1003 00:07:20.228192 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:20 crc kubenswrapper[4832]: I1003 00:07:20.228224 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:20 crc kubenswrapper[4832]: I1003 00:07:20.228234 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:20 crc kubenswrapper[4832]: I1003 00:07:20.228259 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:20 crc kubenswrapper[4832]: I1003 00:07:20.228271 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:20Z","lastTransitionTime":"2025-10-03T00:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:20 crc kubenswrapper[4832]: I1003 00:07:20.330709 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:20 crc kubenswrapper[4832]: I1003 00:07:20.330767 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:20 crc kubenswrapper[4832]: I1003 00:07:20.330779 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:20 crc kubenswrapper[4832]: I1003 00:07:20.330796 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:20 crc kubenswrapper[4832]: I1003 00:07:20.330809 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:20Z","lastTransitionTime":"2025-10-03T00:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:20 crc kubenswrapper[4832]: I1003 00:07:20.433505 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:20 crc kubenswrapper[4832]: I1003 00:07:20.433550 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:20 crc kubenswrapper[4832]: I1003 00:07:20.433561 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:20 crc kubenswrapper[4832]: I1003 00:07:20.433578 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:20 crc kubenswrapper[4832]: I1003 00:07:20.433590 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:20Z","lastTransitionTime":"2025-10-03T00:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:20 crc kubenswrapper[4832]: I1003 00:07:20.537488 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:20 crc kubenswrapper[4832]: I1003 00:07:20.537530 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:20 crc kubenswrapper[4832]: I1003 00:07:20.537539 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:20 crc kubenswrapper[4832]: I1003 00:07:20.537553 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:20 crc kubenswrapper[4832]: I1003 00:07:20.537562 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:20Z","lastTransitionTime":"2025-10-03T00:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:20 crc kubenswrapper[4832]: I1003 00:07:20.645910 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:20 crc kubenswrapper[4832]: I1003 00:07:20.645957 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:20 crc kubenswrapper[4832]: I1003 00:07:20.645966 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:20 crc kubenswrapper[4832]: I1003 00:07:20.645981 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:20 crc kubenswrapper[4832]: I1003 00:07:20.645991 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:20Z","lastTransitionTime":"2025-10-03T00:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:20 crc kubenswrapper[4832]: I1003 00:07:20.748817 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:20 crc kubenswrapper[4832]: I1003 00:07:20.748888 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:20 crc kubenswrapper[4832]: I1003 00:07:20.748906 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:20 crc kubenswrapper[4832]: I1003 00:07:20.748929 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:20 crc kubenswrapper[4832]: I1003 00:07:20.748947 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:20Z","lastTransitionTime":"2025-10-03T00:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:20 crc kubenswrapper[4832]: I1003 00:07:20.851480 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:20 crc kubenswrapper[4832]: I1003 00:07:20.851525 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:20 crc kubenswrapper[4832]: I1003 00:07:20.851533 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:20 crc kubenswrapper[4832]: I1003 00:07:20.851550 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:20 crc kubenswrapper[4832]: I1003 00:07:20.851560 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:20Z","lastTransitionTime":"2025-10-03T00:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:20 crc kubenswrapper[4832]: I1003 00:07:20.953771 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:20 crc kubenswrapper[4832]: I1003 00:07:20.953837 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:20 crc kubenswrapper[4832]: I1003 00:07:20.953851 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:20 crc kubenswrapper[4832]: I1003 00:07:20.953870 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:20 crc kubenswrapper[4832]: I1003 00:07:20.953881 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:20Z","lastTransitionTime":"2025-10-03T00:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:21 crc kubenswrapper[4832]: I1003 00:07:21.056056 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:21 crc kubenswrapper[4832]: I1003 00:07:21.056119 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:21 crc kubenswrapper[4832]: I1003 00:07:21.056132 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:21 crc kubenswrapper[4832]: I1003 00:07:21.056154 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:21 crc kubenswrapper[4832]: I1003 00:07:21.056166 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:21Z","lastTransitionTime":"2025-10-03T00:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:21 crc kubenswrapper[4832]: I1003 00:07:21.159283 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:21 crc kubenswrapper[4832]: I1003 00:07:21.159361 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:21 crc kubenswrapper[4832]: I1003 00:07:21.159399 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:21 crc kubenswrapper[4832]: I1003 00:07:21.159436 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:21 crc kubenswrapper[4832]: I1003 00:07:21.159453 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:21Z","lastTransitionTime":"2025-10-03T00:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:21 crc kubenswrapper[4832]: I1003 00:07:21.263984 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:21 crc kubenswrapper[4832]: I1003 00:07:21.264034 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:21 crc kubenswrapper[4832]: I1003 00:07:21.264046 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:21 crc kubenswrapper[4832]: I1003 00:07:21.264063 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:21 crc kubenswrapper[4832]: I1003 00:07:21.264078 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:21Z","lastTransitionTime":"2025-10-03T00:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:21 crc kubenswrapper[4832]: I1003 00:07:21.367447 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:21 crc kubenswrapper[4832]: I1003 00:07:21.367512 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:21 crc kubenswrapper[4832]: I1003 00:07:21.367525 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:21 crc kubenswrapper[4832]: I1003 00:07:21.367546 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:21 crc kubenswrapper[4832]: I1003 00:07:21.367562 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:21Z","lastTransitionTime":"2025-10-03T00:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:21 crc kubenswrapper[4832]: I1003 00:07:21.386140 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 00:07:21 crc kubenswrapper[4832]: I1003 00:07:21.386245 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 00:07:21 crc kubenswrapper[4832]: I1003 00:07:21.386270 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 00:07:21 crc kubenswrapper[4832]: E1003 00:07:21.386408 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 00:07:21 crc kubenswrapper[4832]: E1003 00:07:21.386561 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 00:07:21 crc kubenswrapper[4832]: E1003 00:07:21.386809 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 00:07:21 crc kubenswrapper[4832]: I1003 00:07:21.471716 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:21 crc kubenswrapper[4832]: I1003 00:07:21.471791 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:21 crc kubenswrapper[4832]: I1003 00:07:21.471803 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:21 crc kubenswrapper[4832]: I1003 00:07:21.471820 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:21 crc kubenswrapper[4832]: I1003 00:07:21.471833 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:21Z","lastTransitionTime":"2025-10-03T00:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:21 crc kubenswrapper[4832]: I1003 00:07:21.573517 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:21 crc kubenswrapper[4832]: I1003 00:07:21.573557 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:21 crc kubenswrapper[4832]: I1003 00:07:21.573566 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:21 crc kubenswrapper[4832]: I1003 00:07:21.573580 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:21 crc kubenswrapper[4832]: I1003 00:07:21.573589 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:21Z","lastTransitionTime":"2025-10-03T00:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:21 crc kubenswrapper[4832]: I1003 00:07:21.645162 4832 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zc9xl_0f189ec1-8169-4886-bff6-f67f7958050e/ovnkube-controller/0.log" Oct 03 00:07:21 crc kubenswrapper[4832]: I1003 00:07:21.647144 4832 generic.go:334] "Generic (PLEG): container finished" podID="0f189ec1-8169-4886-bff6-f67f7958050e" containerID="1961487ddf2d62133f833782a0d3d51915a06b7f33a8b93748baea1713f0edb6" exitCode=1 Oct 03 00:07:21 crc kubenswrapper[4832]: I1003 00:07:21.647174 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" event={"ID":"0f189ec1-8169-4886-bff6-f67f7958050e","Type":"ContainerDied","Data":"1961487ddf2d62133f833782a0d3d51915a06b7f33a8b93748baea1713f0edb6"} Oct 03 00:07:21 crc kubenswrapper[4832]: I1003 00:07:21.647984 4832 scope.go:117] "RemoveContainer" containerID="1961487ddf2d62133f833782a0d3d51915a06b7f33a8b93748baea1713f0edb6" Oct 03 00:07:21 crc kubenswrapper[4832]: I1003 00:07:21.666671 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fad7ffe6-f721-4c38-8acd-2d4de59b527b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da0287ba03b221e49a8840f93c53df8a865b9216a4e93f4a0fc4c62d79227004\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edaa07dca4d8aba0a6439a43af7397b5d9ef4d5a398f5c03ac93f3f41fb80ed9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef77e4c71ea4d3dd7aa61afb0787bf88352730ae638cfefe1f9f9f033208692a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0636ac12ebbda7e4ee8bbe4f5be5454ca7a879c92c8f444466a34c387668398c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af8925d64d9d95117625cfca58e2f0c90128ce63c82370e3a505d48fce14e474\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"message\\\":\\\"file observer\\\\nW1003 00:07:06.003008 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 00:07:06.003232 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 00:07:06.007835 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-713505854/tls.crt::/tmp/serving-cert-713505854/tls.key\\\\\\\"\\\\nI1003 00:07:06.247728 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 00:07:06.253351 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 00:07:06.253377 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 00:07:06.253665 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 00:07:06.253677 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 00:07:06.261087 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 00:07:06.261596 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 00:07:06.261602 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 00:07:06.261606 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 00:07:06.261610 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 00:07:06.261613 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 00:07:06.261616 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 00:07:06.261905 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 00:07:06.264368 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc6fca5c4906594dc721c77dba62a306d840e88ed949a9a579a695bdc1f1c011\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://359b3f91c772c18a52ed1e5985275405b990187a860cf7a7f62c2f602f54c77e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://359b3f91c772c18a52ed1e5985275405b990187a860cf7a7f62c2f602f54c77e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:06:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:21Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:21 crc kubenswrapper[4832]: I1003 00:07:21.675249 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:21 crc kubenswrapper[4832]: I1003 00:07:21.675284 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:21 crc kubenswrapper[4832]: I1003 00:07:21.675294 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:21 crc kubenswrapper[4832]: I1003 00:07:21.675307 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:21 crc kubenswrapper[4832]: I1003 00:07:21.675316 4832 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:21Z","lastTransitionTime":"2025-10-03T00:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:21 crc kubenswrapper[4832]: I1003 00:07:21.679477 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a530701-412d-4b02-b532-de7993881992\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2b0553c8f774a096bc1446325436a1a73a24f67b3512462ce0b5029092f30c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0451961c80d1affd1cc6903d8c4ccba7de47c25255826d49ee312958f7a73b79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09e57b2ff533ebe739ef6e23dcd8d6f5ef85143046db77ed3a95ffb87f771152\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastS
tate\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://703d4dc13279deda56e8154ab7238b507e78852fd477db2bbe6e3e19aa0fbd37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:06:47Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:21Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:21 crc kubenswrapper[4832]: I1003 00:07:21.691802 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:21Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:21 crc kubenswrapper[4832]: I1003 00:07:21.702912 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13eeb165-19f7-4e3f-a442-1c9c570d853b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://196d3468b2a4ea6792c19f85b28df1b5df5e612428d986d2affed70158ba92f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfnk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://786805325c320e5c08542c14a9ac9790f7c2dec237737fe1aca1523d1f2d26a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfnk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xzpk4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:21Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:21 crc kubenswrapper[4832]: I1003 00:07:21.719106 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31c3e0e9-c2a7-4c60-b523-9972df4adad5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1688f7425bdf5630147752e6d2c0dc58351b86d087c39bc6c4d73ffbf11949b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70f4161badca9a8087ab95b77ede5bc72d0c44b3f7f0223f2073354e76d6070d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be
8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5eb1fa15d84200378239e8a064cb83aa401750e92073f9d29d94858e65add776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://342b151ec083dc566428af857e4013b70d34fbf70632147d13f5e251ef3460c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f39a6c9881b2f1d620a85a3c6887d82b81d18ecf599c66b967d5580689537314\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a180babeb6469b2158474f74047864c74117a5c272be9437712bc1729563108b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a180babeb6469b2158474f74047864c74117a5c272be9437712bc1729563108b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b12fcbe54af7073178320f8a130e78196b2d1d3a055c4d8410e4cc1da2d4686e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b12fcbe54af7073178320f8a130e78196b2d1d3a055c4d8410e4cc1da2d4686e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f5e7853d0c8e96873c687c75d7226d8c11ed0ab5f4f2e1e975423de06b7c8c10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5e7853d0c8e96873c687c75d7226d8c11ed0ab5f4f2e1e975423de06b7c8c10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:06:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:21Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:21 crc kubenswrapper[4832]: I1003 00:07:21.733045 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:21Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:21 crc kubenswrapper[4832]: I1003 00:07:21.747154 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4175afbbdf7905f2c69f608e1cd85fdfae2818582d88e9a6d46c20ba0cede788\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:21Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:21 crc kubenswrapper[4832]: I1003 00:07:21.760828 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6753f7eb0f3ff7e4c94125513be77026242079d658b7139e86701a2b04ba5d1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://823db4c8596fa85759a3d818b0282b8b8b2c4c2e948160accbd306ef09e38060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:21Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:21 crc kubenswrapper[4832]: I1003 00:07:21.775439 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rwqt7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96bf1ee1-9249-4db9-83da-17eb15aa5f9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://778c9afa5747e424ff72688b9d90dd21e3faec1f409c2b9683b5a16a37685369\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm92t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rwqt7\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:21Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:21 crc kubenswrapper[4832]: I1003 00:07:21.780515 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:21 crc kubenswrapper[4832]: I1003 00:07:21.780543 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:21 crc kubenswrapper[4832]: I1003 00:07:21.780553 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:21 crc kubenswrapper[4832]: I1003 00:07:21.780574 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:21 crc kubenswrapper[4832]: I1003 00:07:21.780585 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:21Z","lastTransitionTime":"2025-10-03T00:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:21 crc kubenswrapper[4832]: I1003 00:07:21.788258 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-76ck7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b962b7f-e442-4e25-97f4-21f16b9f1324\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e5df7a035e3c381c541f08eec6fffe846e7002a534b4c6ae1316a15cf74e469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ksm2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"h
ostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-76ck7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:21Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:21 crc kubenswrapper[4832]: I1003 00:07:21.801851 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:21Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:21 crc kubenswrapper[4832]: I1003 00:07:21.816471 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zdfsz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66a6f849-6870-4e28-8ca4-c849c22ad252\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53e3e718893a18a23dee11b28bb7285f26501527c6aa136acf2b8f7e0d7494c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f49f638397d6dffb41b5774d362b7441d3191f290149de6ecef5c38a6e8a7a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"s
tarted\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f49f638397d6dffb41b5774d362b7441d3191f290149de6ecef5c38a6e8a7a81\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d52a70ad1026d8eaaeed9dbb581eb90dbc962f036711e2f68085ac437c46b1fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d52a70ad1026d8eaaeed9dbb581eb90dbc962f036711e2f68085ac437c46b1fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ddd5a446bcc39d02bbaf8ae1b9a92b550b975c4280cea13a517076261e177bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ddd5a446bcc39d02bbaf8ae1b9a92b550b975c4280cea13a517076261e177bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"
}]},{\\\"containerID\\\":\\\"cri-o://f907c9873e2f8cbedebcc3116624dbaf2e6b8088d9f0ef52a5878e1515e73720\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f907c9873e2f8cbedebcc3116624dbaf2e6b8088d9f0ef52a5878e1515e73720\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d991977ea5d7140ddc09fecfc872067a05a7773c0066c94ec88c603b6f79f32e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d991977ea5d7140ddc09fecfc872067a05a7773c0066c94ec88c603b6f79f32e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7711e37ec666f0285f64eac11397ea7747d74997deaba1e9cc141218d28fbaa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7711e37ec666f0285f64eac11397ea7747d74997deaba1e9cc141218d28fbaa7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":
\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zdfsz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:21Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:21 crc kubenswrapper[4832]: I1003 00:07:21.830554 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78a6b04d90ea1fd4356f0a54594a83fd6029be216c6263654ad4916f2c3b7f5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:21Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:21 crc kubenswrapper[4832]: I1003 00:07:21.843071 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-b9c9g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4369beec-ad3f-4de9-909b-7d0f8c76aad3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://898c5e76b7f0bbad018b2aa9db1e27240ccbd9d5b66854f548aa6d664abc8799\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvpm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-b9c9g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:21Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:21 crc kubenswrapper[4832]: I1003 00:07:21.861070 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f189ec1-8169-4886-bff6-f67f7958050e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ba40dbb3c6b603f4c75b3f017b0b8953a189aac6dba4cc0e1fa5174b6353f3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2c63d70335d75dec7e19ae160dfa2a00f834d9aa3725bb4b63b94aa488fca0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3b296d00954bf341fee4ad258c6ee44adbd2950c89e73aaf5141201b61c5332\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7712a571bceb2c88adb90f9f18cdef6f7666e96341c1f9c87e6a55c82b59efd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6ee507d1127a1ca82e96f57a031cfef9adbd0b310588ce49a32f0cbcab7b3f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e493b0aab45e7a7842b48f0a13a6850b3bcd8a8f2147880ff278b92b04e9535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1961487ddf2d62133f833782a0d3d51915a06b7f33a8b93748baea1713f0edb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1961487ddf2d62133f833782a0d3d51915a06b7f33a8b93748baea1713f0edb6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T00:07:21Z\\\",\\\"message\\\":\\\"7:21.483814 6155 handler.go:208] Removed *v1.Node event handler 2\\\\nI1003 00:07:21.483821 6155 handler.go:208] Removed *v1.Node event handler 7\\\\nI1003 00:07:21.483930 6155 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI1003 00:07:21.484010 6155 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1003 00:07:21.484089 6155 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1003 00:07:21.484132 6155 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1003 00:07:21.484234 6155 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1003 00:07:21.484416 6155 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from 
sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae1e89f85d166170fd478f58e653b46eaa1c8335c1b124af9fde5b805864197b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0669e6b46f57beece4c83940bcffe580a4b4e2db5451a9c9ca4a9ad486fc95b6\\\",\\\"image\\\":\\\"quay.
io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0669e6b46f57beece4c83940bcffe580a4b4e2db5451a9c9ca4a9ad486fc95b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zc9xl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:21Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:21 crc kubenswrapper[4832]: I1003 00:07:21.884386 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:21 crc kubenswrapper[4832]: I1003 00:07:21.884425 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:21 crc kubenswrapper[4832]: I1003 00:07:21.884437 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:21 crc kubenswrapper[4832]: I1003 00:07:21.884453 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:21 crc kubenswrapper[4832]: I1003 00:07:21.884466 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:21Z","lastTransitionTime":"2025-10-03T00:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:21 crc kubenswrapper[4832]: I1003 00:07:21.894930 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 00:07:21 crc kubenswrapper[4832]: I1003 00:07:21.895097 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 00:07:21 crc kubenswrapper[4832]: I1003 00:07:21.895147 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 00:07:21 crc kubenswrapper[4832]: I1003 00:07:21.895201 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 00:07:21 crc kubenswrapper[4832]: I1003 00:07:21.895235 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 00:07:21 crc kubenswrapper[4832]: E1003 00:07:21.895277 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 00:07:37.895252575 +0000 UTC m=+51.157407701 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 00:07:21 crc kubenswrapper[4832]: E1003 00:07:21.895295 4832 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 00:07:21 crc kubenswrapper[4832]: E1003 00:07:21.895314 4832 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 00:07:21 crc kubenswrapper[4832]: E1003 00:07:21.895327 4832 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 00:07:21 crc kubenswrapper[4832]: E1003 00:07:21.895353 4832 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 00:07:21 crc kubenswrapper[4832]: E1003 00:07:21.895367 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-03 00:07:37.895356629 +0000 UTC m=+51.157511735 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 00:07:21 crc kubenswrapper[4832]: E1003 00:07:21.895509 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 00:07:37.895497444 +0000 UTC m=+51.157652560 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 00:07:21 crc kubenswrapper[4832]: E1003 00:07:21.895396 4832 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 00:07:21 crc kubenswrapper[4832]: E1003 00:07:21.895530 4832 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 00:07:21 crc kubenswrapper[4832]: E1003 00:07:21.895542 4832 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 00:07:21 crc kubenswrapper[4832]: E1003 00:07:21.895569 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-03 00:07:37.895560706 +0000 UTC m=+51.157715822 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 00:07:21 crc kubenswrapper[4832]: E1003 00:07:21.895426 4832 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 03 00:07:21 crc kubenswrapper[4832]: E1003 00:07:21.895602 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 00:07:37.895595077 +0000 UTC m=+51.157750193 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 03 00:07:21 crc kubenswrapper[4832]: I1003 00:07:21.986767 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:21 crc kubenswrapper[4832]: I1003 00:07:21.986810 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:21 crc kubenswrapper[4832]: I1003 00:07:21.986819 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:21 crc kubenswrapper[4832]: I1003 00:07:21.986831 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:21 crc kubenswrapper[4832]: I1003 00:07:21.986839 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:21Z","lastTransitionTime":"2025-10-03T00:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:22 crc kubenswrapper[4832]: I1003 00:07:22.089688 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:22 crc kubenswrapper[4832]: I1003 00:07:22.089725 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:22 crc kubenswrapper[4832]: I1003 00:07:22.089734 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:22 crc kubenswrapper[4832]: I1003 00:07:22.089772 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:22 crc kubenswrapper[4832]: I1003 00:07:22.089784 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:22Z","lastTransitionTime":"2025-10-03T00:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:22 crc kubenswrapper[4832]: I1003 00:07:22.191745 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:22 crc kubenswrapper[4832]: I1003 00:07:22.191807 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:22 crc kubenswrapper[4832]: I1003 00:07:22.191819 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:22 crc kubenswrapper[4832]: I1003 00:07:22.191836 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:22 crc kubenswrapper[4832]: I1003 00:07:22.191849 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:22Z","lastTransitionTime":"2025-10-03T00:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:22 crc kubenswrapper[4832]: I1003 00:07:22.294409 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:22 crc kubenswrapper[4832]: I1003 00:07:22.294447 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:22 crc kubenswrapper[4832]: I1003 00:07:22.294455 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:22 crc kubenswrapper[4832]: I1003 00:07:22.294469 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:22 crc kubenswrapper[4832]: I1003 00:07:22.294478 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:22Z","lastTransitionTime":"2025-10-03T00:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:22 crc kubenswrapper[4832]: I1003 00:07:22.397001 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:22 crc kubenswrapper[4832]: I1003 00:07:22.397042 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:22 crc kubenswrapper[4832]: I1003 00:07:22.397053 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:22 crc kubenswrapper[4832]: I1003 00:07:22.397073 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:22 crc kubenswrapper[4832]: I1003 00:07:22.397086 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:22Z","lastTransitionTime":"2025-10-03T00:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:22 crc kubenswrapper[4832]: I1003 00:07:22.499476 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:22 crc kubenswrapper[4832]: I1003 00:07:22.499516 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:22 crc kubenswrapper[4832]: I1003 00:07:22.499529 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:22 crc kubenswrapper[4832]: I1003 00:07:22.499545 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:22 crc kubenswrapper[4832]: I1003 00:07:22.499554 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:22Z","lastTransitionTime":"2025-10-03T00:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:22 crc kubenswrapper[4832]: I1003 00:07:22.602295 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:22 crc kubenswrapper[4832]: I1003 00:07:22.602326 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:22 crc kubenswrapper[4832]: I1003 00:07:22.602334 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:22 crc kubenswrapper[4832]: I1003 00:07:22.602347 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:22 crc kubenswrapper[4832]: I1003 00:07:22.602357 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:22Z","lastTransitionTime":"2025-10-03T00:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:22 crc kubenswrapper[4832]: I1003 00:07:22.615019 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:22 crc kubenswrapper[4832]: I1003 00:07:22.615065 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:22 crc kubenswrapper[4832]: I1003 00:07:22.615073 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:22 crc kubenswrapper[4832]: I1003 00:07:22.615089 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:22 crc kubenswrapper[4832]: I1003 00:07:22.615099 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:22Z","lastTransitionTime":"2025-10-03T00:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:22 crc kubenswrapper[4832]: E1003 00:07:22.635644 4832 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T00:07:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T00:07:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T00:07:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T00:07:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1e5f816a-2677-4a9b-8b73-ae876a2ba5f2\\\",\\\"systemUUID\\\":\\\"22c336a9-c598-416a-a85e-c54be0a6a3c5\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:22Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:22 crc kubenswrapper[4832]: I1003 00:07:22.640148 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:22 crc kubenswrapper[4832]: I1003 00:07:22.640293 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 00:07:22 crc kubenswrapper[4832]: I1003 00:07:22.640377 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:22 crc kubenswrapper[4832]: I1003 00:07:22.640447 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:22 crc kubenswrapper[4832]: I1003 00:07:22.640514 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:22Z","lastTransitionTime":"2025-10-03T00:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:22 crc kubenswrapper[4832]: I1003 00:07:22.653288 4832 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zc9xl_0f189ec1-8169-4886-bff6-f67f7958050e/ovnkube-controller/0.log" Oct 03 00:07:22 crc kubenswrapper[4832]: I1003 00:07:22.657687 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" event={"ID":"0f189ec1-8169-4886-bff6-f67f7958050e","Type":"ContainerStarted","Data":"4f68e28fe1f55d438c3d42880d9ed4a20367c2e4bd5e8636372586dfa489868e"} Oct 03 00:07:22 crc kubenswrapper[4832]: I1003 00:07:22.658169 4832 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" Oct 03 00:07:22 crc kubenswrapper[4832]: E1003 00:07:22.658793 4832 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T00:07:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T00:07:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T00:07:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T00:07:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1e5f816a-2677-4a9b-8b73-ae876a2ba5f2\\\",\\\"systemUUID\\\":\\\"22c336a9-c598-416a-a85e-c54be0a6a3c5\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:22Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:22 crc kubenswrapper[4832]: I1003 00:07:22.663733 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:22 crc kubenswrapper[4832]: I1003 00:07:22.663789 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 00:07:22 crc kubenswrapper[4832]: I1003 00:07:22.663800 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:22 crc kubenswrapper[4832]: I1003 00:07:22.663813 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:22 crc kubenswrapper[4832]: I1003 00:07:22.663825 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:22Z","lastTransitionTime":"2025-10-03T00:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:22 crc kubenswrapper[4832]: E1003 00:07:22.678447 4832 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T00:07:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T00:07:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T00:07:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T00:07:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1e5f816a-2677-4a9b-8b73-ae876a2ba5f2\\\",\\\"systemUUID\\\":\\\"22c336a9-c598-416a-a85e-c54be0a6a3c5\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:22Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:22 crc kubenswrapper[4832]: I1003 00:07:22.683775 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:22 crc kubenswrapper[4832]: I1003 00:07:22.683840 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 00:07:22 crc kubenswrapper[4832]: I1003 00:07:22.683860 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:22 crc kubenswrapper[4832]: I1003 00:07:22.683886 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:22 crc kubenswrapper[4832]: I1003 00:07:22.683902 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:22Z","lastTransitionTime":"2025-10-03T00:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:22 crc kubenswrapper[4832]: I1003 00:07:22.689944 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31c3e0e9-c2a7-4c60-b523-9972df4adad5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1688f7425bdf5630147752e6d2c0dc58351b86d087c39bc6c4d73ffbf11949b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70f4161badca9a8087ab95b77ede5bc72d0c44b3f7f0223f2073354e76d6070d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resourc
es\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5eb1fa15d84200378239e8a064cb83aa401750e92073f9d29d94858e65add776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://342b151ec083dc566428af857e4013b70d34fbf70632147d13f5e251ef3460c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f39a6c9881b2f1d620a85a3c6887d82b81d18ecf599c66b967d5580689537314\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a180babeb6469b2158474f74047864c74117a5c272be9437712bc1729563108b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a180babeb6469b2158474f74047864c74117a5c272be9437712bc1729563108b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:48Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b12fcbe54af7073178320f8a130e78196b2d1d3a055c4d8410e4cc1da2d4686e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b12fcbe54af7073178320f8a130e78196b2d1d3a055c4d8410e4cc1da2d4686e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f5e7853d0c8e96873c687c75d7226d8c11ed0ab5f4f2e1e975423de06b7c8c10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5e7853d0c8e96873c687c75d7226d8c11ed0ab5f4f2e1e975423de06b7c8c10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:06:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:22Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:22 crc kubenswrapper[4832]: E1003 00:07:22.699846 4832 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T00:07:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T00:07:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T00:07:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T00:07:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\
"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":45063
7738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1e5f816a-2677-4a9b-8b73-ae876a2ba5f2\\\",\\\"systemUUID\\\":\\\"22c336a9-c598-416a-a85e-c54be0a6a3c5\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:22Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:22 crc kubenswrapper[4832]: I1003 00:07:22.704509 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:22 crc kubenswrapper[4832]: I1003 00:07:22.704575 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:22 crc kubenswrapper[4832]: I1003 00:07:22.704593 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:22 crc kubenswrapper[4832]: I1003 00:07:22.704619 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:22 crc kubenswrapper[4832]: I1003 00:07:22.704638 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:22Z","lastTransitionTime":"2025-10-03T00:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:22 crc kubenswrapper[4832]: I1003 00:07:22.711315 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:22Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:22 crc kubenswrapper[4832]: E1003 00:07:22.721164 4832 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T00:07:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T00:07:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T00:07:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T00:07:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1e5f816a-2677-4a9b-8b73-ae876a2ba5f2\\\",\\\"systemUUID\\\":\\\"22c336a9-c598-416a-a85e-c54be0a6a3c5\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:22Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:22 crc kubenswrapper[4832]: E1003 00:07:22.721515 4832 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 03 00:07:22 crc kubenswrapper[4832]: I1003 00:07:22.723630 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 03 00:07:22 crc kubenswrapper[4832]: I1003 00:07:22.723787 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:22 crc kubenswrapper[4832]: I1003 00:07:22.723892 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:22 crc kubenswrapper[4832]: I1003 00:07:22.723986 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:22 crc kubenswrapper[4832]: I1003 00:07:22.724065 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:22Z","lastTransitionTime":"2025-10-03T00:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:22 crc kubenswrapper[4832]: I1003 00:07:22.731908 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4175afbbdf7905f2c69f608e1cd85fdfae2818582d88e9a6d46c20ba0cede788\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:22Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:22 crc kubenswrapper[4832]: I1003 00:07:22.744516 4832 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6753f7eb0f3ff7e4c94125513be77026242079d658b7139e86701a2b04ba5d1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://823db4c8596fa85759a3d818b0282b8b8b2c4c2e948160accbd306ef09e38060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:22Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:22 crc kubenswrapper[4832]: I1003 00:07:22.757415 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rwqt7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96bf1ee1-9249-4db9-83da-17eb15aa5f9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://778c9afa5747e424ff72688b9d90dd21e3faec1f409c2b9683b5a16a37685369\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm92t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rwqt7\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:22Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:22 crc kubenswrapper[4832]: I1003 00:07:22.765894 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-76ck7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b962b7f-e442-4e25-97f4-21f16b9f1324\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e5df7a035e3c381c541f08eec6fffe846e7002a534b4c6ae1316a15cf74e469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ksm2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-76ck7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:22Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:22 crc kubenswrapper[4832]: I1003 00:07:22.782548 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:22Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:22 crc kubenswrapper[4832]: I1003 00:07:22.796881 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zdfsz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"66a6f849-6870-4e28-8ca4-c849c22ad252\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53e3e718893a18a23dee11b28bb7285f26501527c6aa136acf2b8f7e0d7494c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f49f638397d6dffb41b5774d362b7441d3191f290149de6ecef5c38a6e8a7a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f49f638397d6dffb41b5774d362b7441d3191f290149de6ecef5c38a6e8a7a81\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d52a70ad1026d8eaaeed9dbb581eb90dbc962f036711e2f68085ac437c46b1fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d52a70ad1026d8eaaeed9dbb581eb90dbc962f036711e2f68085ac437c46b1fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ddd5a446bcc39d02bbaf8ae1b9a92b550b975c4280cea13a517076261e177bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ddd5a446bcc39d02bbaf8ae1b9a92b550b975c4280cea13a517076261e177bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f907c9873e2f8cbedebcc3116624dbaf2e6b8088d9f0ef52a5878e1515e73720\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f907c9873e2f8cbedebcc3116624dbaf2e6b8088d9f0ef52a5878e1515e73720\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d991977ea5d7140ddc09fecfc872067a05a7773c0066c94ec88c603b6f79f32e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d991977ea5d7140ddc09fecfc872067a05a7773c0066c94ec88c603b6f79f32e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7711e37ec666f0285f64eac11397ea7747d74997deaba1e9cc141218d28fbaa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7711e37ec666f0285f64eac11397ea7747d74997deaba1e9cc141218d28fbaa7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zdfsz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:22Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:22 crc kubenswrapper[4832]: I1003 00:07:22.810223 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78a6b04d90ea1fd4356f0a54594a83fd6029be216c6263654ad4916f2c3b7f5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:22Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:22 crc kubenswrapper[4832]: I1003 00:07:22.820880 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-b9c9g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4369beec-ad3f-4de9-909b-7d0f8c76aad3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://898c5e76b7f0bbad018b2aa9db1e27240ccbd9d5b66854f548aa6d664abc8799\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvpm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-b9c9g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:22Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:22 crc kubenswrapper[4832]: I1003 00:07:22.826196 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:22 crc kubenswrapper[4832]: I1003 00:07:22.826240 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:22 crc kubenswrapper[4832]: I1003 00:07:22.826254 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:22 crc kubenswrapper[4832]: I1003 00:07:22.826272 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:22 crc kubenswrapper[4832]: I1003 00:07:22.826285 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:22Z","lastTransitionTime":"2025-10-03T00:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:22 crc kubenswrapper[4832]: I1003 00:07:22.840614 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f189ec1-8169-4886-bff6-f67f7958050e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ba40dbb3c6b603f4c75b3f017b0b8953a189aac6dba4cc0e1fa5174b6353f3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2c63d70335d75dec7e19ae160dfa2a00f834d9aa3725bb4b63b94aa488fca0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recu
rsiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3b296d00954bf341fee4ad258c6ee44adbd2950c89e73aaf5141201b61c5332\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7712a571bceb2c88adb90f9f18cdef6f7666e96341c1f9c87e6a55c82b59efd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6ee507d1127a1ca82e96f57a031cfef9adbd0b310588ce49a32f0cbcab7b3f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e493b0aab45e7a7842b48f0a13a6850b3bcd8a8f2147880ff278b92b04e9535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d7732574532
65a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f68e28fe1f55d438c3d42880d9ed4a20367c2e4bd5e8636372586dfa489868e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1961487ddf2d62133f833782a0d3d51915a06b7f33a8b93748baea1713f0edb6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T00:07:21Z\\\",\\\"message\\\":\\\"7:21.483814 6155 handler.go:208] Removed *v1.Node event handler 2\\\\nI1003 00:07:21.483821 6155 handler.go:208] Removed *v1.Node event handler 7\\\\nI1003 00:07:21.483930 6155 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI1003 00:07:21.484010 6155 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1003 00:07:21.484089 6155 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1003 00:07:21.484132 6155 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1003 00:07:21.484234 6155 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1003 00:07:21.484416 6155 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from 
sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:18Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae1e89f85d166170fd478f58e653b46eaa1c8335c1b124af9fde5b805864197b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\
\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0669e6b46f57beece4c83940bcffe580a4b4e2db5451a9c9ca4a9ad486fc95b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0669e6b46f57beece4c83940bcffe580a4b4e2db5451a9c9ca4a9ad486fc95b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zc9xl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:22Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:22 crc kubenswrapper[4832]: I1003 00:07:22.854887 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fad7ffe6-f721-4c38-8acd-2d4de59b527b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da0287ba03b221e49a8840f93c53df8a865b9216a4e93f4a0fc4c62d79227004\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edaa07dca4d8aba0a6439a43af7397b5d9ef4d5a398f5c03ac93f3f41fb80ed9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef77e4c71ea4d3dd7aa61afb0787bf88352730ae638cfefe1f9f9f033208692a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0636ac12ebbda7e4ee8bbe4f5be5454ca7a879c92c8f444466a34c387668398c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af8925d64d9d95117625cfca58e2f0c90128ce63c82370e3a505d48fce14e474\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"message\\\":\\\"file observer\\\\nW1003 00:07:06.003008 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 00:07:06.003232 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 00:07:06.007835 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-713505854/tls.crt::/tmp/serving-cert-713505854/tls.key\\\\\\\"\\\\nI1003 00:07:06.247728 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 00:07:06.253351 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 00:07:06.253377 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 00:07:06.253665 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 00:07:06.253677 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 00:07:06.261087 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 00:07:06.261596 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 00:07:06.261602 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 00:07:06.261606 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 00:07:06.261610 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 00:07:06.261613 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 00:07:06.261616 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 00:07:06.261905 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 00:07:06.264368 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc6fca5c4906594dc721c77dba62a306d840e88ed949a9a579a695bdc1f1c011\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://359b3f91c772c18a52ed1e5985275405b990187a860cf7a7f62c2f602f54c77e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://359b3f91c772c18a52ed1e5985275405b990187a860cf7a7f62c2f602f54c77e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:06:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:22Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:22 crc kubenswrapper[4832]: I1003 00:07:22.866906 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a530701-412d-4b02-b532-de7993881992\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2b0553c8f774a096bc1446325436a1a73a24f67b3512462ce0b5029092f30c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0451961c80d1affd1cc6903d8c4ccba7de47c25255826d49ee312958f7a73b79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09e57b2ff533ebe739ef6e23dcd8d6f5ef85143046db77ed3a95ffb87f771152\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://703d4dc13279deda56e8154ab7238b507e78852fd477db2bbe6e3e19aa0fbd37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:06:47Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:22Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:22 crc kubenswrapper[4832]: I1003 00:07:22.882449 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:22Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:22 crc kubenswrapper[4832]: I1003 00:07:22.894245 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13eeb165-19f7-4e3f-a442-1c9c570d853b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://196d3468b2a4ea6792c19f85b28df1b5df5e612428d986d2affed70158ba92f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfnk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://786805325c320e5c08542c14a9ac9790f7c2dec237737fe1aca1523d1f2d26a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfnk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xzpk4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:22Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:22 crc kubenswrapper[4832]: I1003 00:07:22.929190 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:22 crc kubenswrapper[4832]: I1003 00:07:22.929497 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:22 crc kubenswrapper[4832]: I1003 00:07:22.929506 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:22 crc kubenswrapper[4832]: I1003 00:07:22.929520 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:22 crc kubenswrapper[4832]: I1003 00:07:22.929529 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:22Z","lastTransitionTime":"2025-10-03T00:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:23 crc kubenswrapper[4832]: I1003 00:07:23.032719 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:23 crc kubenswrapper[4832]: I1003 00:07:23.032781 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:23 crc kubenswrapper[4832]: I1003 00:07:23.032793 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:23 crc kubenswrapper[4832]: I1003 00:07:23.032810 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:23 crc kubenswrapper[4832]: I1003 00:07:23.032821 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:23Z","lastTransitionTime":"2025-10-03T00:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:23 crc kubenswrapper[4832]: I1003 00:07:23.135305 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:23 crc kubenswrapper[4832]: I1003 00:07:23.135343 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:23 crc kubenswrapper[4832]: I1003 00:07:23.135354 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:23 crc kubenswrapper[4832]: I1003 00:07:23.135374 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:23 crc kubenswrapper[4832]: I1003 00:07:23.135385 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:23Z","lastTransitionTime":"2025-10-03T00:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:23 crc kubenswrapper[4832]: I1003 00:07:23.238133 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:23 crc kubenswrapper[4832]: I1003 00:07:23.238190 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:23 crc kubenswrapper[4832]: I1003 00:07:23.238209 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:23 crc kubenswrapper[4832]: I1003 00:07:23.238229 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:23 crc kubenswrapper[4832]: I1003 00:07:23.238244 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:23Z","lastTransitionTime":"2025-10-03T00:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:23 crc kubenswrapper[4832]: I1003 00:07:23.340736 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:23 crc kubenswrapper[4832]: I1003 00:07:23.340793 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:23 crc kubenswrapper[4832]: I1003 00:07:23.340832 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:23 crc kubenswrapper[4832]: I1003 00:07:23.340853 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:23 crc kubenswrapper[4832]: I1003 00:07:23.340865 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:23Z","lastTransitionTime":"2025-10-03T00:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:23 crc kubenswrapper[4832]: I1003 00:07:23.386224 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 00:07:23 crc kubenswrapper[4832]: I1003 00:07:23.386268 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 00:07:23 crc kubenswrapper[4832]: E1003 00:07:23.386372 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 00:07:23 crc kubenswrapper[4832]: I1003 00:07:23.386436 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 00:07:23 crc kubenswrapper[4832]: E1003 00:07:23.386470 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 00:07:23 crc kubenswrapper[4832]: E1003 00:07:23.386623 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 00:07:23 crc kubenswrapper[4832]: I1003 00:07:23.449143 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:23 crc kubenswrapper[4832]: I1003 00:07:23.449198 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:23 crc kubenswrapper[4832]: I1003 00:07:23.449213 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:23 crc kubenswrapper[4832]: I1003 00:07:23.449234 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:23 crc kubenswrapper[4832]: I1003 00:07:23.449247 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:23Z","lastTransitionTime":"2025-10-03T00:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:23 crc kubenswrapper[4832]: I1003 00:07:23.552388 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:23 crc kubenswrapper[4832]: I1003 00:07:23.552445 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:23 crc kubenswrapper[4832]: I1003 00:07:23.552459 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:23 crc kubenswrapper[4832]: I1003 00:07:23.552483 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:23 crc kubenswrapper[4832]: I1003 00:07:23.552499 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:23Z","lastTransitionTime":"2025-10-03T00:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:23 crc kubenswrapper[4832]: I1003 00:07:23.660311 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:23 crc kubenswrapper[4832]: I1003 00:07:23.660417 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:23 crc kubenswrapper[4832]: I1003 00:07:23.660444 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:23 crc kubenswrapper[4832]: I1003 00:07:23.660497 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:23 crc kubenswrapper[4832]: I1003 00:07:23.660530 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:23Z","lastTransitionTime":"2025-10-03T00:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:23 crc kubenswrapper[4832]: I1003 00:07:23.664798 4832 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zc9xl_0f189ec1-8169-4886-bff6-f67f7958050e/ovnkube-controller/1.log" Oct 03 00:07:23 crc kubenswrapper[4832]: I1003 00:07:23.665802 4832 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zc9xl_0f189ec1-8169-4886-bff6-f67f7958050e/ovnkube-controller/0.log" Oct 03 00:07:23 crc kubenswrapper[4832]: I1003 00:07:23.669699 4832 generic.go:334] "Generic (PLEG): container finished" podID="0f189ec1-8169-4886-bff6-f67f7958050e" containerID="4f68e28fe1f55d438c3d42880d9ed4a20367c2e4bd5e8636372586dfa489868e" exitCode=1 Oct 03 00:07:23 crc kubenswrapper[4832]: I1003 00:07:23.669742 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" event={"ID":"0f189ec1-8169-4886-bff6-f67f7958050e","Type":"ContainerDied","Data":"4f68e28fe1f55d438c3d42880d9ed4a20367c2e4bd5e8636372586dfa489868e"} Oct 03 00:07:23 crc kubenswrapper[4832]: I1003 00:07:23.669817 4832 scope.go:117] "RemoveContainer" containerID="1961487ddf2d62133f833782a0d3d51915a06b7f33a8b93748baea1713f0edb6" Oct 03 00:07:23 crc kubenswrapper[4832]: I1003 00:07:23.671144 4832 scope.go:117] "RemoveContainer" containerID="4f68e28fe1f55d438c3d42880d9ed4a20367c2e4bd5e8636372586dfa489868e" Oct 03 00:07:23 crc kubenswrapper[4832]: E1003 00:07:23.671553 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-zc9xl_openshift-ovn-kubernetes(0f189ec1-8169-4886-bff6-f67f7958050e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" podUID="0f189ec1-8169-4886-bff6-f67f7958050e" Oct 03 00:07:23 crc kubenswrapper[4832]: I1003 00:07:23.686400 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:23Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:23 crc kubenswrapper[4832]: I1003 00:07:23.708186 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zdfsz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66a6f849-6870-4e28-8ca4-c849c22ad252\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53e3e718893a18a23dee11b28bb7285f26501527c6aa136acf2b8f7e0d7494c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f49f638397d6dffb41b5774d362b7441d3191f290149de6ecef5c38a6e8a7a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"s
tarted\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f49f638397d6dffb41b5774d362b7441d3191f290149de6ecef5c38a6e8a7a81\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d52a70ad1026d8eaaeed9dbb581eb90dbc962f036711e2f68085ac437c46b1fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d52a70ad1026d8eaaeed9dbb581eb90dbc962f036711e2f68085ac437c46b1fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ddd5a446bcc39d02bbaf8ae1b9a92b550b975c4280cea13a517076261e177bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ddd5a446bcc39d02bbaf8ae1b9a92b550b975c4280cea13a517076261e177bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"
}]},{\\\"containerID\\\":\\\"cri-o://f907c9873e2f8cbedebcc3116624dbaf2e6b8088d9f0ef52a5878e1515e73720\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f907c9873e2f8cbedebcc3116624dbaf2e6b8088d9f0ef52a5878e1515e73720\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d991977ea5d7140ddc09fecfc872067a05a7773c0066c94ec88c603b6f79f32e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d991977ea5d7140ddc09fecfc872067a05a7773c0066c94ec88c603b6f79f32e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7711e37ec666f0285f64eac11397ea7747d74997deaba1e9cc141218d28fbaa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7711e37ec666f0285f64eac11397ea7747d74997deaba1e9cc141218d28fbaa7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":
\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zdfsz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:23Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:23 crc kubenswrapper[4832]: I1003 00:07:23.725230 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78a6b04d90ea1fd4356f0a54594a83fd6029be216c6263654ad4916f2c3b7f5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:23Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:23 crc kubenswrapper[4832]: I1003 00:07:23.736480 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-b9c9g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4369beec-ad3f-4de9-909b-7d0f8c76aad3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://898c5e76b7f0bbad018b2aa9db1e27240ccbd9d5b66854f548aa6d664abc8799\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvpm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-b9c9g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:23Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:23 crc kubenswrapper[4832]: I1003 00:07:23.757554 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f189ec1-8169-4886-bff6-f67f7958050e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ba40dbb3c6b603f4c75b3f017b0b8953a189aac6dba4cc0e1fa5174b6353f3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2c63d70335d75dec7e19ae160dfa2a00f834d9aa3725bb4b63b94aa488fca0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3b296d00954bf341fee4ad258c6ee44adbd2950c89e73aaf5141201b61c5332\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7712a571bceb2c88adb90f9f18cdef6f7666e96341c1f9c87e6a55c82b59efd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6ee507d1127a1ca82e96f57a031cfef9adbd0b310588ce49a32f0cbcab7b3f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e493b0aab45e7a7842b48f0a13a6850b3bcd8a8f2147880ff278b92b04e9535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f68e28fe1f55d438c3d42880d9ed4a20367c2e4bd5e8636372586dfa489868e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1961487ddf2d62133f833782a0d3d51915a06b7f33a8b93748baea1713f0edb6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T00:07:21Z\\\",\\\"message\\\":\\\"7:21.483814 6155 handler.go:208] Removed *v1.Node event handler 2\\\\nI1003 00:07:21.483821 6155 handler.go:208] Removed *v1.Node event handler 7\\\\nI1003 00:07:21.483930 6155 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI1003 00:07:21.484010 6155 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1003 00:07:21.484089 6155 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1003 00:07:21.484132 6155 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1003 00:07:21.484234 6155 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1003 00:07:21.484416 6155 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:18Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f68e28fe1f55d438c3d42880d9ed4a20367c2e4bd5e8636372586dfa489868e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T00:07:22Z\\\",\\\"message\\\":\\\"s:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-controller-manager-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-controller-manager-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.58\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF1003 00:07:22.607875 6290 ovnkube.go:137] failed to run ovnkube: [failed to start 
network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed t\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae1e89f85d166170fd478f58e653b46eaa1c8335c1b124af9fde5b805864197b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"moun
tPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0669e6b46f57beece4c83940bcffe580a4b4e2db5451a9c9ca4a9ad486fc95b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0669e6b46f57beece4c83940bcffe580a4b4e2db5451a9c9ca4a9ad486fc95b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zc9xl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:23Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:23 crc kubenswrapper[4832]: I1003 00:07:23.765065 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:23 crc kubenswrapper[4832]: I1003 00:07:23.765118 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:23 crc kubenswrapper[4832]: I1003 00:07:23.765131 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:23 crc kubenswrapper[4832]: I1003 00:07:23.765153 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:23 crc kubenswrapper[4832]: I1003 00:07:23.765164 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:23Z","lastTransitionTime":"2025-10-03T00:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:23 crc kubenswrapper[4832]: I1003 00:07:23.774595 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:23Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:23 crc kubenswrapper[4832]: I1003 00:07:23.796718 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"13eeb165-19f7-4e3f-a442-1c9c570d853b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://196d3468b2a4ea6792c19f85b28df1b5df5e612428d986d2affed70158ba92f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfnk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://786805325c320e5c08542c14a9ac9790f7c2dec237737fe1aca1523d1f2d26a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfnk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xzpk4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:23Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:23 crc kubenswrapper[4832]: I1003 00:07:23.814849 4832 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fad7ffe6-f721-4c38-8acd-2d4de59b527b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da0287ba03b221e49a8840f93c53df8a865b9216a4e93f4a0fc4c62d79227004\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edaa07dca4d8aba0a6439a43af7397b5d9ef4d5a398f5c03ac93f3f41fb80ed9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef77e4c71ea4d3dd7aa61afb0787bf88352730ae638cfefe1f9f9f033208692a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\
"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0636ac12ebbda7e4ee8bbe4f5be5454ca7a879c92c8f444466a34c387668398c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af8925d64d9d95117625cfca58e2f0c90128ce63c82370e3a505d48fce14e474\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"message\\\":\\\"file observer\\\\nW1003 00:07:06.003008 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 00:07:06.003232 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 00:07:06.007835 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-713505854/tls.crt::/tmp/serving-cert-713505854/tls.key\\\\\\\"\\\\nI1003 00:07:06.247728 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 00:07:06.253351 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 00:07:06.253377 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 00:07:06.253665 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 00:07:06.253677 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 00:07:06.261087 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 00:07:06.261596 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 00:07:06.261602 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 00:07:06.261606 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 00:07:06.261610 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 00:07:06.261613 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 00:07:06.261616 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 00:07:06.261905 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 00:07:06.264368 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc6fca5c4906594dc721c77dba62a306d840e88ed949a9a579a695bdc1f1c011\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://359b3f91c772c18a52ed1e5985275405b990187a860cf7a7f62c2f602f54c77e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://359b3f91c772c18a52ed1e5985275405b990187a860cf7a7f62c2f602f54c77e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:06:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:23Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:23 crc kubenswrapper[4832]: I1003 00:07:23.833639 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a530701-412d-4b02-b532-de7993881992\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2b0553c8f774a096bc1446325436a1a73a24f67b3512462ce0b5029092f30c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0451961c80d1affd1cc6903d8c4ccba7de47c25255826d49ee312958f7a73b79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09e57b2ff533ebe739ef6e23dcd8d6f5ef85143046db77ed3a95ffb87f771152\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://703d4dc13279deda56e8154ab7238b507e78852fd477db2bbe6e3e19aa0fbd37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:06:47Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:23Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:23 crc kubenswrapper[4832]: I1003 00:07:23.848877 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6753f7eb0f3ff7e4c94125513be77026242079d658b7139e86701a2b04ba5d1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://823db4c8596fa85759a3d818b0282b8b8b2c4c2e948160accbd306ef09e38060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482
919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:23Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:23 crc kubenswrapper[4832]: I1003 00:07:23.868508 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:23 crc kubenswrapper[4832]: I1003 00:07:23.868546 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:23 crc kubenswrapper[4832]: I1003 00:07:23.868557 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:23 crc kubenswrapper[4832]: I1003 00:07:23.868574 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:23 crc kubenswrapper[4832]: I1003 00:07:23.868585 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:23Z","lastTransitionTime":"2025-10-03T00:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:23 crc kubenswrapper[4832]: I1003 00:07:23.868573 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rwqt7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96bf1ee1-9249-4db9-83da-17eb15aa5f9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://778c9afa5747e424ff72688b9d90dd21e3faec1f409c2b9683b5a16a37685369\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm92t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rwqt7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:23Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:23 crc kubenswrapper[4832]: I1003 00:07:23.881745 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-76ck7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b962b7f-e442-4e25-97f4-21f16b9f1324\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e5df7a035e3c381c541f08eec6fffe846e7002a534b4c6ae1316a15cf74e469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ksm2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-76ck7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:23Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:23 crc kubenswrapper[4832]: I1003 00:07:23.903895 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"31c3e0e9-c2a7-4c60-b523-9972df4adad5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1688f7425bdf5630147752e6d2c0dc58351b86d087c39bc6c4d73ffbf11949b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70f4161badca9a8087ab95b77ede5bc72d0c44b3f7f0223f2073354e76d6070d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5eb1fa15d84200378239e8a064cb83aa401750e92073f9d29d94858e65add776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://342b151ec083dc566428af857e4013b70d34fbf
70632147d13f5e251ef3460c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f39a6c9881b2f1d620a85a3c6887d82b81d18ecf599c66b967d5580689537314\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a180babeb6469b2158474f74047864c74117a5c272be9437712bc1729563108b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a180babeb6469b2158474f74047864c74117a5c272be9437712bc1729563108b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b12fcbe54af7073178320f8a130e78196b2d1d3a055c4d8410e4cc1da2d4686e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b12fcbe54af7073178320f8a130e78196b2d1d3a055c4d8410e4cc1da2d4686e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f5e7853d0c8e96873c687c75d7226d8c11ed0ab5f4f2e1e975423de06b7c8c10\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5e7853d0c8e96873c687c75d7226d8c11ed0ab5f4f2e1e975423de06b7c8c10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:06:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:23Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:23 crc kubenswrapper[4832]: I1003 00:07:23.917853 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:23Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:23 crc kubenswrapper[4832]: I1003 00:07:23.931383 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4175afbbdf7905f2c69f608e1cd85fdfae2818582d88e9a6d46c20ba0cede788\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:23Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:23 crc kubenswrapper[4832]: I1003 00:07:23.971673 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:23 crc kubenswrapper[4832]: I1003 00:07:23.971796 4832 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:23 crc kubenswrapper[4832]: I1003 00:07:23.971815 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:23 crc kubenswrapper[4832]: I1003 00:07:23.971842 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:23 crc kubenswrapper[4832]: I1003 00:07:23.971862 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:23Z","lastTransitionTime":"2025-10-03T00:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:24 crc kubenswrapper[4832]: I1003 00:07:24.074330 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:24 crc kubenswrapper[4832]: I1003 00:07:24.074411 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:24 crc kubenswrapper[4832]: I1003 00:07:24.074435 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:24 crc kubenswrapper[4832]: I1003 00:07:24.074469 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:24 crc kubenswrapper[4832]: I1003 00:07:24.074488 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:24Z","lastTransitionTime":"2025-10-03T00:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:24 crc kubenswrapper[4832]: I1003 00:07:24.143389 4832 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4j5bh"] Oct 03 00:07:24 crc kubenswrapper[4832]: I1003 00:07:24.143931 4832 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4j5bh" Oct 03 00:07:24 crc kubenswrapper[4832]: I1003 00:07:24.146119 4832 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Oct 03 00:07:24 crc kubenswrapper[4832]: I1003 00:07:24.146131 4832 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Oct 03 00:07:24 crc kubenswrapper[4832]: I1003 00:07:24.161531 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78a6b04d90ea1fd4356f0a54594a83fd6029be216c6263654ad4916f2c3b7f5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:24Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:24 crc kubenswrapper[4832]: I1003 00:07:24.172569 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-b9c9g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4369beec-ad3f-4de9-909b-7d0f8c76aad3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://898c5e76b7f0bbad018b2aa9db1e27240ccbd9d5b66854f548aa6d664abc8799\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvpm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-b9c9g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:24Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:24 crc kubenswrapper[4832]: I1003 00:07:24.176583 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:24 crc kubenswrapper[4832]: I1003 00:07:24.176630 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:24 crc kubenswrapper[4832]: I1003 00:07:24.176641 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:24 crc kubenswrapper[4832]: I1003 00:07:24.176660 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:24 crc kubenswrapper[4832]: I1003 00:07:24.176674 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:24Z","lastTransitionTime":"2025-10-03T00:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:24 crc kubenswrapper[4832]: I1003 00:07:24.193855 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f189ec1-8169-4886-bff6-f67f7958050e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ba40dbb3c6b603f4c75b3f017b0b8953a189aac6dba4cc0e1fa5174b6353f3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2c63d70335d75dec7e19ae160dfa2a00f834d9aa3725bb4b63b94aa488fca0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recu
rsiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3b296d00954bf341fee4ad258c6ee44adbd2950c89e73aaf5141201b61c5332\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7712a571bceb2c88adb90f9f18cdef6f7666e96341c1f9c87e6a55c82b59efd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6ee507d1127a1ca82e96f57a031cfef9adbd0b310588ce49a32f0cbcab7b3f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e493b0aab45e7a7842b48f0a13a6850b3bcd8a8f2147880ff278b92b04e9535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d7732574532
65a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f68e28fe1f55d438c3d42880d9ed4a20367c2e4bd5e8636372586dfa489868e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1961487ddf2d62133f833782a0d3d51915a06b7f33a8b93748baea1713f0edb6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T00:07:21Z\\\",\\\"message\\\":\\\"7:21.483814 6155 handler.go:208] Removed *v1.Node event handler 2\\\\nI1003 00:07:21.483821 6155 handler.go:208] Removed *v1.Node event handler 7\\\\nI1003 00:07:21.483930 6155 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI1003 00:07:21.484010 6155 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1003 00:07:21.484089 6155 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1003 00:07:21.484132 6155 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1003 00:07:21.484234 6155 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1003 00:07:21.484416 6155 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from 
sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:18Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f68e28fe1f55d438c3d42880d9ed4a20367c2e4bd5e8636372586dfa489868e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T00:07:22Z\\\",\\\"message\\\":\\\"s:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-controller-manager-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-controller-manager-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.58\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF1003 00:07:22.607875 6290 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed 
t\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae1e89f85d166170fd478f58e653b46eaa1c8335c1b124af9fde5b805864197b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0669e6b46f57beece4c83940bcffe580a4b4e2db5451a9c9ca4a9ad486fc95b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d209
9482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0669e6b46f57beece4c83940bcffe580a4b4e2db5451a9c9ca4a9ad486fc95b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zc9xl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:24Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:24 crc kubenswrapper[4832]: I1003 00:07:24.210082 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:24Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:24 crc kubenswrapper[4832]: I1003 00:07:24.218725 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7262a85e-7e44-4422-b1d2-5d045503e06a-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-4j5bh\" (UID: \"7262a85e-7e44-4422-b1d2-5d045503e06a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4j5bh" Oct 03 00:07:24 crc kubenswrapper[4832]: I1003 00:07:24.218889 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7262a85e-7e44-4422-b1d2-5d045503e06a-env-overrides\") pod \"ovnkube-control-plane-749d76644c-4j5bh\" (UID: \"7262a85e-7e44-4422-b1d2-5d045503e06a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4j5bh" Oct 03 00:07:24 crc kubenswrapper[4832]: I1003 00:07:24.218938 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7262a85e-7e44-4422-b1d2-5d045503e06a-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-4j5bh\" (UID: \"7262a85e-7e44-4422-b1d2-5d045503e06a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4j5bh" Oct 03 00:07:24 crc kubenswrapper[4832]: I1003 00:07:24.219043 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwhr8\" (UniqueName: \"kubernetes.io/projected/7262a85e-7e44-4422-b1d2-5d045503e06a-kube-api-access-vwhr8\") pod \"ovnkube-control-plane-749d76644c-4j5bh\" (UID: \"7262a85e-7e44-4422-b1d2-5d045503e06a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4j5bh" Oct 03 00:07:24 crc kubenswrapper[4832]: I1003 00:07:24.223715 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"13eeb165-19f7-4e3f-a442-1c9c570d853b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://196d3468b2a4ea6792c19f85b28df1b5df5e612428d986d2affed70158ba92f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfnk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://786805325c320e5c08542c14a9ac9790f7c2dec237737fe1aca1523d1f2d26a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfnk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xzpk4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:24Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:24 crc kubenswrapper[4832]: I1003 00:07:24.238049 4832 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fad7ffe6-f721-4c38-8acd-2d4de59b527b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da0287ba03b221e49a8840f93c53df8a865b9216a4e93f4a0fc4c62d79227004\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edaa07dca4d8aba0a6439a43af7397b5d9ef4d5a398f5c03ac93f3f41fb80ed9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef77e4c71ea4d3dd7aa61afb0787bf88352730ae638cfefe1f9f9f033208692a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\
"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0636ac12ebbda7e4ee8bbe4f5be5454ca7a879c92c8f444466a34c387668398c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af8925d64d9d95117625cfca58e2f0c90128ce63c82370e3a505d48fce14e474\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"message\\\":\\\"file observer\\\\nW1003 00:07:06.003008 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 00:07:06.003232 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 00:07:06.007835 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-713505854/tls.crt::/tmp/serving-cert-713505854/tls.key\\\\\\\"\\\\nI1003 00:07:06.247728 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 00:07:06.253351 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 00:07:06.253377 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 00:07:06.253665 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 00:07:06.253677 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 00:07:06.261087 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 00:07:06.261596 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 00:07:06.261602 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 00:07:06.261606 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 00:07:06.261610 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 00:07:06.261613 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 00:07:06.261616 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 00:07:06.261905 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 00:07:06.264368 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc6fca5c4906594dc721c77dba62a306d840e88ed949a9a579a695bdc1f1c011\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://359b3f91c772c18a52ed1e5985275405b990187a860cf7a7f62c2f602f54c77e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://359b3f91c772c18a52ed1e5985275405b990187a860cf7a7f62c2f602f54c77e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:06:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:24Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:24 crc kubenswrapper[4832]: I1003 00:07:24.255869 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a530701-412d-4b02-b532-de7993881992\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2b0553c8f774a096bc1446325436a1a73a24f67b3512462ce0b5029092f30c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0451961c80d1affd1cc6903d8c4ccba7de47c25255826d49ee312958f7a73b79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09e57b2ff533ebe739ef6e23dcd8d6f5ef85143046db77ed3a95ffb87f771152\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://703d4dc13279deda56e8154ab7238b507e78852fd477db2bbe6e3e19aa0fbd37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:06:47Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:24Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:24 crc kubenswrapper[4832]: I1003 00:07:24.270070 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6753f7eb0f3ff7e4c94125513be77026242079d658b7139e86701a2b04ba5d1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://823db4c8596fa85759a3d818b0282b8b8b2c4c2e948160accbd306ef09e38060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482
919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:24Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:24 crc kubenswrapper[4832]: I1003 00:07:24.279240 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:24 crc kubenswrapper[4832]: I1003 00:07:24.279288 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:24 crc kubenswrapper[4832]: I1003 00:07:24.279304 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:24 crc kubenswrapper[4832]: I1003 00:07:24.279326 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:24 crc kubenswrapper[4832]: I1003 00:07:24.279340 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:24Z","lastTransitionTime":"2025-10-03T00:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:24 crc kubenswrapper[4832]: I1003 00:07:24.290224 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rwqt7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96bf1ee1-9249-4db9-83da-17eb15aa5f9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://778c9afa5747e424ff72688b9d90dd21e3faec1f409c2b9683b5a16a37685369\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm92t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rwqt7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:24Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:24 crc kubenswrapper[4832]: I1003 00:07:24.306021 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-76ck7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b962b7f-e442-4e25-97f4-21f16b9f1324\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e5df7a035e3c381c541f08eec6fffe846e7002a534b4c6ae1316a15cf74e469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ksm2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-76ck7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:24Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:24 crc kubenswrapper[4832]: I1003 00:07:24.320468 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vwhr8\" (UniqueName: \"kubernetes.io/projected/7262a85e-7e44-4422-b1d2-5d045503e06a-kube-api-access-vwhr8\") pod \"ovnkube-control-plane-749d76644c-4j5bh\" (UID: 
\"7262a85e-7e44-4422-b1d2-5d045503e06a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4j5bh" Oct 03 00:07:24 crc kubenswrapper[4832]: I1003 00:07:24.320636 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7262a85e-7e44-4422-b1d2-5d045503e06a-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-4j5bh\" (UID: \"7262a85e-7e44-4422-b1d2-5d045503e06a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4j5bh" Oct 03 00:07:24 crc kubenswrapper[4832]: I1003 00:07:24.320678 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7262a85e-7e44-4422-b1d2-5d045503e06a-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-4j5bh\" (UID: \"7262a85e-7e44-4422-b1d2-5d045503e06a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4j5bh" Oct 03 00:07:24 crc kubenswrapper[4832]: I1003 00:07:24.320711 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7262a85e-7e44-4422-b1d2-5d045503e06a-env-overrides\") pod \"ovnkube-control-plane-749d76644c-4j5bh\" (UID: \"7262a85e-7e44-4422-b1d2-5d045503e06a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4j5bh" Oct 03 00:07:24 crc kubenswrapper[4832]: I1003 00:07:24.321954 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7262a85e-7e44-4422-b1d2-5d045503e06a-env-overrides\") pod \"ovnkube-control-plane-749d76644c-4j5bh\" (UID: \"7262a85e-7e44-4422-b1d2-5d045503e06a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4j5bh" Oct 03 00:07:24 crc kubenswrapper[4832]: I1003 00:07:24.322027 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7262a85e-7e44-4422-b1d2-5d045503e06a-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-4j5bh\" (UID: \"7262a85e-7e44-4422-b1d2-5d045503e06a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4j5bh" Oct 03 00:07:24 crc kubenswrapper[4832]: I1003 00:07:24.323083 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4j5bh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7262a85e-7e44-4422-b1d2-5d045503e06a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vwhr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vwhr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4j5bh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:24Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:24 crc kubenswrapper[4832]: I1003 00:07:24.334961 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7262a85e-7e44-4422-b1d2-5d045503e06a-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-4j5bh\" (UID: \"7262a85e-7e44-4422-b1d2-5d045503e06a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4j5bh" Oct 03 00:07:24 crc kubenswrapper[4832]: I1003 00:07:24.344999 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"31c3e0e9-c2a7-4c60-b523-9972df4adad5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1688f7425bdf5630147752e6d2c0dc58351b86d087c39bc6c4d73ffbf11949b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70f4161badca9a8087ab95b77ede5bc72d0c44b3f7f0223f2073354e76d6070d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5eb1fa15d84200378239e8a064cb83aa401750e92073f9d29d94858e65add776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://342b151ec083dc566428af857e4013b70d34fbf
70632147d13f5e251ef3460c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f39a6c9881b2f1d620a85a3c6887d82b81d18ecf599c66b967d5580689537314\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a180babeb6469b2158474f74047864c74117a5c272be9437712bc1729563108b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a180babeb6469b2158474f74047864c74117a5c272be9437712bc1729563108b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b12fcbe54af7073178320f8a130e78196b2d1d3a055c4d8410e4cc1da2d4686e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b12fcbe54af7073178320f8a130e78196b2d1d3a055c4d8410e4cc1da2d4686e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f5e7853d0c8e96873c687c75d7226d8c11ed0ab5f4f2e1e975423de06b7c8c10\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5e7853d0c8e96873c687c75d7226d8c11ed0ab5f4f2e1e975423de06b7c8c10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:06:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:24Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:24 crc kubenswrapper[4832]: I1003 00:07:24.351790 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwhr8\" (UniqueName: \"kubernetes.io/projected/7262a85e-7e44-4422-b1d2-5d045503e06a-kube-api-access-vwhr8\") pod \"ovnkube-control-plane-749d76644c-4j5bh\" (UID: \"7262a85e-7e44-4422-b1d2-5d045503e06a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4j5bh" Oct 03 00:07:24 crc kubenswrapper[4832]: I1003 00:07:24.366143 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:24Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:24 crc kubenswrapper[4832]: I1003 00:07:24.382275 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:24 crc kubenswrapper[4832]: I1003 00:07:24.382317 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:24 crc kubenswrapper[4832]: I1003 00:07:24.382326 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:24 crc kubenswrapper[4832]: I1003 00:07:24.382341 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:24 crc kubenswrapper[4832]: I1003 00:07:24.382354 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:24Z","lastTransitionTime":"2025-10-03T00:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:24 crc kubenswrapper[4832]: I1003 00:07:24.385308 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4175afbbdf7905f2c69f608e1cd85fdfae2818582d88e9a6d46c20ba0cede788\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:24Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:24 crc kubenswrapper[4832]: I1003 00:07:24.399381 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:24Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:24 crc kubenswrapper[4832]: I1003 00:07:24.418708 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zdfsz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66a6f849-6870-4e28-8ca4-c849c22ad252\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53e3e718893a18a23dee11b28bb7285f26501527c6aa136acf2b8f7e0d7494c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\
\\":[{\\\"containerID\\\":\\\"cri-o://f49f638397d6dffb41b5774d362b7441d3191f290149de6ecef5c38a6e8a7a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f49f638397d6dffb41b5774d362b7441d3191f290149de6ecef5c38a6e8a7a81\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d52a70ad1026d8eaaeed9dbb581eb90dbc962f036711e2f68085ac437c46b1fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d52a70ad1026d8eaaeed9dbb581eb90dbc962f036711e2f68085ac437c46b1fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ddd5a446bcc39d02bbaf8ae1b9a92b550b975c4280cea13a517076261e177bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ddd5a446bcc39d02bbaf8ae1b9a92b550b975c4280cea13a517076261e177bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z
\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f907c9873e2f8cbedebcc3116624dbaf2e6b8088d9f0ef52a5878e1515e73720\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f907c9873e2f8cbedebcc3116624dbaf2e6b8088d9f0ef52a5878e1515e73720\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d991977ea5d7140ddc09fecfc872067a05a7773c0066c94ec88c603b6f79f32e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d991977ea5d7140ddc09fecfc872067a05a7773c0066c94ec88c603b6f79f32e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7711e37ec666f0285f64eac11397ea7747d74997deaba1e9cc141218d28fbaa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7711e37ec666f0285f64eac11397ea7747d74997deaba1e9cc141218d28fbaa7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zdfsz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:24Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:24 crc kubenswrapper[4832]: I1003 00:07:24.460338 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4j5bh" Oct 03 00:07:24 crc kubenswrapper[4832]: W1003 00:07:24.484606 4832 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7262a85e_7e44_4422_b1d2_5d045503e06a.slice/crio-d310ecf66fd688e517beb8279e1e84a401a2e122ac24d19bad4115e3c72790cf WatchSource:0}: Error finding container d310ecf66fd688e517beb8279e1e84a401a2e122ac24d19bad4115e3c72790cf: Status 404 returned error can't find the container with id d310ecf66fd688e517beb8279e1e84a401a2e122ac24d19bad4115e3c72790cf Oct 03 00:07:24 crc kubenswrapper[4832]: I1003 00:07:24.487409 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:24 crc kubenswrapper[4832]: I1003 00:07:24.487471 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:24 crc kubenswrapper[4832]: I1003 00:07:24.487488 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:24 crc kubenswrapper[4832]: I1003 00:07:24.487514 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:24 crc kubenswrapper[4832]: I1003 00:07:24.487530 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:24Z","lastTransitionTime":"2025-10-03T00:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:24 crc kubenswrapper[4832]: I1003 00:07:24.591021 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:24 crc kubenswrapper[4832]: I1003 00:07:24.591066 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:24 crc kubenswrapper[4832]: I1003 00:07:24.591077 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:24 crc kubenswrapper[4832]: I1003 00:07:24.591097 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:24 crc kubenswrapper[4832]: I1003 00:07:24.591112 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:24Z","lastTransitionTime":"2025-10-03T00:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:24 crc kubenswrapper[4832]: I1003 00:07:24.674602 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4j5bh" event={"ID":"7262a85e-7e44-4422-b1d2-5d045503e06a","Type":"ContainerStarted","Data":"d310ecf66fd688e517beb8279e1e84a401a2e122ac24d19bad4115e3c72790cf"} Oct 03 00:07:24 crc kubenswrapper[4832]: I1003 00:07:24.684055 4832 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zc9xl_0f189ec1-8169-4886-bff6-f67f7958050e/ovnkube-controller/1.log" Oct 03 00:07:24 crc kubenswrapper[4832]: I1003 00:07:24.692061 4832 scope.go:117] "RemoveContainer" containerID="4f68e28fe1f55d438c3d42880d9ed4a20367c2e4bd5e8636372586dfa489868e" Oct 03 00:07:24 crc kubenswrapper[4832]: E1003 00:07:24.692716 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-zc9xl_openshift-ovn-kubernetes(0f189ec1-8169-4886-bff6-f67f7958050e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" podUID="0f189ec1-8169-4886-bff6-f67f7958050e" Oct 03 00:07:24 crc kubenswrapper[4832]: I1003 00:07:24.697083 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:24 crc kubenswrapper[4832]: I1003 00:07:24.697127 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:24 crc kubenswrapper[4832]: I1003 00:07:24.697141 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:24 crc kubenswrapper[4832]: I1003 00:07:24.697163 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:24 crc kubenswrapper[4832]: I1003 00:07:24.697178 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:24Z","lastTransitionTime":"2025-10-03T00:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:24 crc kubenswrapper[4832]: I1003 00:07:24.710818 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78a6b04d90ea1fd4356f0a54594a83fd6029be216c6263654ad4916f2c3b7f5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:24Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:24 crc kubenswrapper[4832]: I1003 00:07:24.723705 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-b9c9g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4369beec-ad3f-4de9-909b-7d0f8c76aad3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://898c5e76b7f0bbad018b2aa9db1e27240ccbd9d5b66854f548aa6d664abc8799\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvpm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-b9c9g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:24Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:24 crc kubenswrapper[4832]: I1003 00:07:24.748216 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f189ec1-8169-4886-bff6-f67f7958050e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ba40dbb3c6b603f4c75b3f017b0b8953a189aac6dba4cc0e1fa5174b6353f3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2c63d70335d75dec7e19ae160dfa2a00f834d9aa3725bb4b63b94aa488fca0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3b296d00954bf341fee4ad258c6ee44adbd2950c89e73aaf5141201b61c5332\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7712a571bceb2c88adb90f9f18cdef6f7666e96341c1f9c87e6a55c82b59efd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6ee507d1127a1ca82e96f57a031cfef9adbd0b310588ce49a32f0cbcab7b3f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e493b0aab45e7a7842b48f0a13a6850b3bcd8a8f2147880ff278b92b04e9535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f68e28fe1f55d438c3d42880d9ed4a20367c2e4bd5e8636372586dfa489868e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f68e28fe1f55d438c3d42880d9ed4a20367c2e4bd5e8636372586dfa489868e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T00:07:22Z\\\",\\\"message\\\":\\\"s:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-controller-manager-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-controller-manager-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.58\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF1003 00:07:22.607875 6290 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed t\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:21Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-zc9xl_openshift-ovn-kubernetes(0f189ec1-8169-4886-bff6-f67f7958050e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae1e89f85d166170fd478f58e653b46eaa1c8335c1b124af9fde5b805864197b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0669e6b46f57beece4c83940bcffe580a4b4e2db5451a9c9ca4a9ad486fc95b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0669e6b46f57beece4c83940bcffe580a4b4e2db5451a9c9ca4a9ad486fc95b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zc9xl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:24Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:24 crc kubenswrapper[4832]: I1003 00:07:24.767569 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fad7ffe6-f721-4c38-8acd-2d4de59b527b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da0287ba03b221e49a8840f93c53df8a865b9216a4e93f4a0fc4c62d79227004\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edaa07dca4d8aba0a6439a43af7397b5d9ef4d5a398f5c03ac93f3f41fb80ed9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef77e4c71ea4d3dd7aa61afb0787bf88352730ae638cfefe1f9f9f033208692a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0636ac12ebbda7e4ee8bbe4f5be5454ca7a879c92c8f444466a34c387668398c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af8925d64d9d95117625cfca58e2f0c90128ce63c82370e3a505d48fce14e474\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"message\\\":\\\"file observer\\\\nW1003 00:07:06.003008 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 00:07:06.003232 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 00:07:06.007835 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-713505854/tls.crt::/tmp/serving-cert-713505854/tls.key\\\\\\\"\\\\nI1003 00:07:06.247728 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 00:07:06.253351 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 00:07:06.253377 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 00:07:06.253665 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 00:07:06.253677 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 00:07:06.261087 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 00:07:06.261596 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 00:07:06.261602 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 00:07:06.261606 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 00:07:06.261610 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 00:07:06.261613 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 00:07:06.261616 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 00:07:06.261905 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 00:07:06.264368 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc6fca5c4906594dc721c77dba62a306d840e88ed949a9a579a695bdc1f1c011\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://359b3f91c772c18a52ed1e5985275405b990187a860cf7a7f62c2f602f54c77e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://359b3f91c772c18a52ed1e5985275405b990187a860cf7a7f62c2f602f54c77e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:06:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:24Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:24 crc kubenswrapper[4832]: I1003 00:07:24.790844 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a530701-412d-4b02-b532-de7993881992\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2b0553c8f774a096bc1446325436a1a73a24f67b3512462ce0b5029092f30c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0451961c80d1affd1cc6903d8c4ccba7de47c25255826d49ee312958f7a73b79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09e57b2ff533ebe739ef6e23dcd8d6f5ef85143046db77ed3a95ffb87f771152\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://703d4dc13279deda56e8154ab7238b507e78852fd477db2bbe6e3e19aa0fbd37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:06:47Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:24Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:24 crc kubenswrapper[4832]: I1003 00:07:24.802952 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:24 crc kubenswrapper[4832]: I1003 00:07:24.803049 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:24 crc kubenswrapper[4832]: I1003 00:07:24.803078 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:24 crc kubenswrapper[4832]: I1003 00:07:24.803115 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:24 crc kubenswrapper[4832]: I1003 00:07:24.803150 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:24Z","lastTransitionTime":"2025-10-03T00:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:24 crc kubenswrapper[4832]: I1003 00:07:24.811797 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:24Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:24 crc kubenswrapper[4832]: I1003 00:07:24.830534 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"13eeb165-19f7-4e3f-a442-1c9c570d853b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://196d3468b2a4ea6792c19f85b28df1b5df5e612428d986d2affed70158ba92f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfnk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://786805325c320e5c08542c14a9ac9790f7c2dec237737fe1aca1523d1f2d26a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfnk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xzpk4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:24Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:24 crc kubenswrapper[4832]: I1003 00:07:24.854857 4832 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31c3e0e9-c2a7-4c60-b523-9972df4adad5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1688f7425bdf5630147752e6d2c0dc58351b86d087c39bc6c4d73ffbf11949b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70f4161badca9a8087ab95b77ede5bc72d0c44b3f7f0223f2073354e76d6070d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5eb1fa15d84200378239e8a064cb83aa401750e92073f9d29d94858e65add776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-di
r\\\"}]},{\\\"containerID\\\":\\\"cri-o://342b151ec083dc566428af857e4013b70d34fbf70632147d13f5e251ef3460c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f39a6c9881b2f1d620a85a3c6887d82b81d18ecf599c66b967d5580689537314\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a180babeb6469b2158474f74047864c74117a5c272be9437712bc1729563108b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a180babeb6469b2158474f74047864c74117a5c272be9437712bc1729563108b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b12fcbe54af7073178320f8a130e78196b2d1d3a055c4d8410e4cc1da2d4686e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b12fcbe54af7073178320f8a130e78196b2d1d3a055c4d8410e4cc1da2d4686e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}}},{\\\"containerID\\\"
:\\\"cri-o://f5e7853d0c8e96873c687c75d7226d8c11ed0ab5f4f2e1e975423de06b7c8c10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5e7853d0c8e96873c687c75d7226d8c11ed0ab5f4f2e1e975423de06b7c8c10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:06:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:24Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:24 crc kubenswrapper[4832]: I1003 00:07:24.871303 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:24Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:24 crc kubenswrapper[4832]: I1003 00:07:24.890976 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4175afbbdf7905f2c69f608e1cd85fdfae2818582d88e9a6d46c20ba0cede788\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:24Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:24 crc kubenswrapper[4832]: I1003 00:07:24.905548 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:24 crc kubenswrapper[4832]: I1003 00:07:24.905852 4832 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:24 crc kubenswrapper[4832]: I1003 00:07:24.905912 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:24 crc kubenswrapper[4832]: I1003 00:07:24.905991 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:24 crc kubenswrapper[4832]: I1003 00:07:24.906047 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:24Z","lastTransitionTime":"2025-10-03T00:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:24 crc kubenswrapper[4832]: I1003 00:07:24.906394 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6753f7eb0f3ff7e4c94125513be77026242079d658b7139e86701a2b04ba5d1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://823db4c8596fa85759a3d818b0282b8b8b2c4c2e948160accbd306ef09e38060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-
config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:24Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:24 crc kubenswrapper[4832]: I1003 00:07:24.922498 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rwqt7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96bf1ee1-9249-4db9-83da-17eb15aa5f9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://778c9afa5747e424ff72688b9d90dd21e3faec1f409c2b9683b5a16a37685369\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"mu
ltus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm92t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rwqt7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:24Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:24 crc kubenswrapper[4832]: I1003 00:07:24.938683 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-76ck7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b962b7f-e442-4e25-97f4-21f16b9f1324\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e5df7a035e3c381c541f08eec6fffe846e7002a534b4c6ae1316a15cf74e469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ksm2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-76ck7\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:24Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:24 crc kubenswrapper[4832]: I1003 00:07:24.949606 4832 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 00:07:24 crc kubenswrapper[4832]: I1003 00:07:24.955428 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4j5bh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7262a85e-7e44-4422-b1d2-5d045503e06a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vwhr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vwhr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\
\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4j5bh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:24Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:24 crc kubenswrapper[4832]: I1003 00:07:24.971872 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:24Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:24 crc kubenswrapper[4832]: I1003 00:07:24.988544 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zdfsz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66a6f849-6870-4e28-8ca4-c849c22ad252\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53e3e718893a18a23dee11b28bb7285f26501527c6aa136acf2b8f7e0d7494c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f49f638397d6dffb41b5774d362b7441d3191f290149de6ecef5c38a6e8a7a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"s
tarted\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f49f638397d6dffb41b5774d362b7441d3191f290149de6ecef5c38a6e8a7a81\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d52a70ad1026d8eaaeed9dbb581eb90dbc962f036711e2f68085ac437c46b1fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d52a70ad1026d8eaaeed9dbb581eb90dbc962f036711e2f68085ac437c46b1fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ddd5a446bcc39d02bbaf8ae1b9a92b550b975c4280cea13a517076261e177bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ddd5a446bcc39d02bbaf8ae1b9a92b550b975c4280cea13a517076261e177bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"
}]},{\\\"containerID\\\":\\\"cri-o://f907c9873e2f8cbedebcc3116624dbaf2e6b8088d9f0ef52a5878e1515e73720\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f907c9873e2f8cbedebcc3116624dbaf2e6b8088d9f0ef52a5878e1515e73720\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d991977ea5d7140ddc09fecfc872067a05a7773c0066c94ec88c603b6f79f32e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d991977ea5d7140ddc09fecfc872067a05a7773c0066c94ec88c603b6f79f32e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7711e37ec666f0285f64eac11397ea7747d74997deaba1e9cc141218d28fbaa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7711e37ec666f0285f64eac11397ea7747d74997deaba1e9cc141218d28fbaa7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":
\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zdfsz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:24Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:25 crc kubenswrapper[4832]: I1003 00:07:25.003483 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:25Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:25 crc kubenswrapper[4832]: I1003 00:07:25.008438 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:25 crc kubenswrapper[4832]: I1003 00:07:25.008579 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:25 crc kubenswrapper[4832]: I1003 00:07:25.008705 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:25 crc kubenswrapper[4832]: I1003 00:07:25.008840 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:25 crc kubenswrapper[4832]: I1003 00:07:25.008919 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:25Z","lastTransitionTime":"2025-10-03T00:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:25 crc kubenswrapper[4832]: I1003 00:07:25.019685 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zdfsz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66a6f849-6870-4e28-8ca4-c849c22ad252\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53e3e718893a18a23dee11b28bb7285f26501527c6aa136acf2b8f7e0d7494c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f49f638397d6dffb41b5774d362b7441d3191f290149de6ecef5c38a6e8a7a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f49f638397d6dffb41b5774d362b7441d3191f290149de6ecef5c38a6e8a7a81\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d52a70ad1026d8eaaeed9dbb581eb90dbc962f036711e2f68085ac437c46b1fd\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d52a70ad1026d8eaaeed9dbb581eb90dbc962f036711e2f68085ac437c46b1fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ddd5a446bcc39d02bbaf8ae1b9a92b550b975c4280cea13a517076261e177bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ddd5a446bcc39d02bbaf8ae1b9a92b550b975c4280cea13a517076261e177bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f907c9873e2f8cbedebcc3116624dbaf2e6b8088d9f0ef52a5878e1515e73720\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f907c9873e2f8cbedebcc3116624dbaf2e6b8088d9f0ef52a5878e1515e73720\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d991977ea5d7140ddc09fecfc872067a05a7773c0066c94ec88c603b6f79f32e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d991977ea5d7140ddc09fecfc872067a05a7773c0066c94ec88c603b6f79f32e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7711e37ec666f0285f64eac11397ea7747d74997deaba1e9cc141218d28fbaa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7711e37ec666f0285f64eac11397ea7747d74997deaba1e9cc141218d28fbaa7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zdfsz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:25Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:25 crc kubenswrapper[4832]: I1003 00:07:25.035335 4832 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78a6b04d90ea1fd4356f0a54594a83fd6029be216c6263654ad4916f2c3b7f5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:25Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:25 crc kubenswrapper[4832]: I1003 00:07:25.050428 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-b9c9g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4369beec-ad3f-4de9-909b-7d0f8c76aad3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://898c5e76b7f0bbad018b2aa9db1e27240ccbd9d5b66854f548aa6d664abc8799\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvpm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-b9c9g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:25Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:25 crc kubenswrapper[4832]: I1003 00:07:25.073347 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f189ec1-8169-4886-bff6-f67f7958050e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ba40dbb3c6b603f4c75b3f017b0b8953a189aac6dba4cc0e1fa5174b6353f3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2c63d70335d75dec7e19ae160dfa2a00f834d9aa3725bb4b63b94aa488fca0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3b296d00954bf341fee4ad258c6ee44adbd2950c89e73aaf5141201b61c5332\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7712a571bceb2c88adb90f9f18cdef6f7666e96341c1f9c87e6a55c82b59efd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6ee507d1127a1ca82e96f57a031cfef9adbd0b310588ce49a32f0cbcab7b3f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e493b0aab45e7a7842b48f0a13a6850b3bcd8a8f2147880ff278b92b04e9535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f68e28fe1f55d438c3d42880d9ed4a20367c2e4bd5e8636372586dfa489868e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f68e28fe1f55d438c3d42880d9ed4a20367c2e4bd5e8636372586dfa489868e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T00:07:22Z\\\",\\\"message\\\":\\\"s:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-controller-manager-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-controller-manager-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.58\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF1003 00:07:22.607875 6290 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed t\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:21Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-zc9xl_openshift-ovn-kubernetes(0f189ec1-8169-4886-bff6-f67f7958050e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae1e89f85d166170fd478f58e653b46eaa1c8335c1b124af9fde5b805864197b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0669e6b46f57beece4c83940bcffe580a4b4e2db5451a9c9ca4a9ad486fc95b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0669e6b46f57beece4c83940bcffe580a4b4e2db5451a9c9ca4a9ad486fc95b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zc9xl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:25Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:25 crc kubenswrapper[4832]: I1003 00:07:25.095216 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13eeb165-19f7-4e3f-a442-1c9c570d853b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://196d3468b2a4ea6792c19f85b28df1b5df5e612428d986d2affed70158ba92f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfnk8\\\",
\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://786805325c320e5c08542c14a9ac9790f7c2dec237737fe1aca1523d1f2d26a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfnk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xzpk4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:25Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:25 crc kubenswrapper[4832]: I1003 00:07:25.111164 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:25 crc kubenswrapper[4832]: I1003 00:07:25.111348 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:25 crc kubenswrapper[4832]: I1003 00:07:25.111419 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:25 crc kubenswrapper[4832]: I1003 00:07:25.111485 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:25 crc kubenswrapper[4832]: I1003 00:07:25.111549 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:25Z","lastTransitionTime":"2025-10-03T00:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:25 crc kubenswrapper[4832]: I1003 00:07:25.116651 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fad7ffe6-f721-4c38-8acd-2d4de59b527b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da0287ba03b221e49a8840f93c53df8a865b9216a4e93f4a0fc4c62d79227004\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edaa07dca4d8aba0a6439a43af7397b5d9ef4d5a398f5c03ac93f3f41fb80ed9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef77e4c71ea4d3dd7aa61afb0787bf88352730ae638cfefe1f9f9f033208692a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0636ac12ebbda7e4ee8bbe4f5be5454ca7a879c92c8f444466a34c387668398c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af8925d64d9d95117625cfca58e2f0c90128ce63c82370e3a505d48fce14e474\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"message\\\":\\\"file observer\\\\nW1003 00:07:06.003008 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 00:07:06.003232 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 00:07:06.007835 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-713505854/tls.crt::/tmp/serving-cert-713505854/tls.key\\\\\\\"\\\\nI1003 00:07:06.247728 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 00:07:06.253351 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 00:07:06.253377 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 00:07:06.253665 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 00:07:06.253677 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 00:07:06.261087 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 00:07:06.261596 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 00:07:06.261602 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 00:07:06.261606 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 00:07:06.261610 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 00:07:06.261613 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 00:07:06.261616 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 00:07:06.261905 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 00:07:06.264368 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc6fca5c4906594dc721c77dba62a306d840e88ed949a9a579a695bdc1f1c011\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://359b3f91c772c18a52ed1e5985275405b990187a860cf7a7f62c2f602f54c77e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://359b3f91c772c18a52ed1e5985275405b990187a860cf7a7f62c2f602f54c77e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:06:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:25Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:25 crc kubenswrapper[4832]: I1003 00:07:25.130324 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a530701-412d-4b02-b532-de7993881992\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2b0553c8f774a096bc1446325436a1a73a24f67b3512462ce0b5029092f30c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0451961c80d1affd1cc6903d8c4ccba7de47c25255826d49ee312958f7a73b79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09e57b2ff533ebe739ef6e23dcd8d6f5ef85143046db77ed3a95ffb87f771152\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://703d4dc13279deda56e8154ab7238b507e78852fd477db2bbe6e3e19aa0fbd37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:06:47Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:25Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:25 crc kubenswrapper[4832]: I1003 00:07:25.143131 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:25Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:25 crc kubenswrapper[4832]: I1003 00:07:25.153564 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-76ck7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b962b7f-e442-4e25-97f4-21f16b9f1324\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e5df7a035e3c381c541f08eec6fffe846e7002a534b4c6ae1316a15cf74e469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ksm2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-76ck7\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:25Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:25 crc kubenswrapper[4832]: I1003 00:07:25.165230 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4j5bh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7262a85e-7e44-4422-b1d2-5d045503e06a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vwhr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vwhr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4j5bh\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:25Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:25 crc kubenswrapper[4832]: I1003 00:07:25.187059 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31c3e0e9-c2a7-4c60-b523-9972df4adad5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1688f7425bdf5630147752e6d2c0dc58351b86d087c39bc6c4d73ffbf11949b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70f4161badca9a8087ab95b77ede5bc72d0c44b3f7f0223f2073354e76d6070d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5eb1fa15d84200378239e8a064cb83aa401750e92073f9d29d94858e65add776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8e
e7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://342b151ec083dc566428af857e4013b70d34fbf70632147d13f5e251ef3460c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f39a6c9881b2f1d620a85a3c6887d82b81d18ecf599c66b967d5580689537314\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a180babeb6469b2158474f74047864c74117a5c272be9437712bc1729563108b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a180babeb6469b2158474f74047864c74117a5c272be9437712bc1729563108b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b12fcbe54af7073178320f8a130e78196b2d1d3a055c4d8410e4cc1da2d4686e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"et
cd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b12fcbe54af7073178320f8a130e78196b2d1d3a055c4d8410e4cc1da2d4686e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f5e7853d0c8e96873c687c75d7226d8c11ed0ab5f4f2e1e975423de06b7c8c10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5e7853d0c8e96873c687c75d7226d8c11ed0ab5f4f2e1e975423de06b7c8c10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:06:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:25Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:25 crc kubenswrapper[4832]: I1003 00:07:25.201065 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:25Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:25 crc kubenswrapper[4832]: I1003 00:07:25.213605 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:25 crc kubenswrapper[4832]: I1003 00:07:25.213903 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:25 crc kubenswrapper[4832]: I1003 00:07:25.214066 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:25 crc kubenswrapper[4832]: I1003 00:07:25.214159 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:25 crc kubenswrapper[4832]: I1003 00:07:25.214227 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4175afbbdf7905f2c69f608e1cd85fdfae2818582d88e9a6d46c20ba0cede788\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" 
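
Every "Failed to update status for pod" entry in this section is rejected by the same admission webhook, pod.network-node-identity.openshift.io at https://127.0.0.1:9743, whose serving certificate expired at 2025-08-24T17:21:41Z while the node clock reads 2025-10-03T00:07:25Z. A minimal way to confirm the expiry from the node is sketched below; it assumes Python 3 plus the third-party cryptography package are available on the host, neither of which is implied by the log itself.

# Sketch: fetch the serving certificate presented on 127.0.0.1:9743 (the
# webhook endpoint named in the Post URL of the errors above) and print its
# validity window. ssl.get_server_certificate() skips chain validation, so it
# still returns the PEM even though the certificate is expired.
import ssl
from cryptography import x509

HOST, PORT = "127.0.0.1", 9743   # taken from the webhook error messages

try:
    pem = ssl.get_server_certificate((HOST, PORT))
except OSError as err:
    raise SystemExit(f"could not fetch certificate from {HOST}:{PORT}: {err}")

cert = x509.load_pem_x509_certificate(pem.encode())
print("subject:  ", cert.subject.rfc4514_string())
print("notBefore:", cert.not_valid_before)   # *_utc variants exist on cryptography >= 42
print("notAfter: ", cert.not_valid_after)    # expect 2025-08-24 17:21:41 per the log

The same 2025-08-24T17:21:41Z expiry appears in every webhook failure in this boot, which points at a single stale certificate rather than several independent problems.
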
for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:25Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:25 crc kubenswrapper[4832]: I1003 00:07:25.214244 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:25Z","lastTransitionTime":"2025-10-03T00:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:25 crc kubenswrapper[4832]: I1003 00:07:25.226614 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6753f7eb0f3ff7e4c94125513be77026242079d658b7139e86701a2b04ba5d1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://823db4c8596fa85759a3d818b0282b8b8b2c4c2e948160accbd306ef09e38060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity
-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:25Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:25 crc kubenswrapper[4832]: I1003 00:07:25.238094 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rwqt7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96bf1ee1-9249-4db9-83da-17eb15aa5f9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://778c9afa5747e424ff72688b9d90dd21e3faec1f409c2b9683b5a16a37685369\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"m
ultus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm92t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rwqt7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:25Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:25 crc kubenswrapper[4832]: I1003 00:07:25.317019 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:25 crc kubenswrapper[4832]: I1003 00:07:25.317056 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:25 crc kubenswrapper[4832]: I1003 00:07:25.317065 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:25 crc kubenswrapper[4832]: I1003 00:07:25.317079 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:25 crc kubenswrapper[4832]: I1003 00:07:25.317089 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:25Z","lastTransitionTime":"2025-10-03T00:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:25 crc kubenswrapper[4832]: I1003 00:07:25.386287 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 00:07:25 crc kubenswrapper[4832]: E1003 00:07:25.386399 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 00:07:25 crc kubenswrapper[4832]: I1003 00:07:25.386299 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 00:07:25 crc kubenswrapper[4832]: I1003 00:07:25.386290 4832 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 00:07:25 crc kubenswrapper[4832]: E1003 00:07:25.386463 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 00:07:25 crc kubenswrapper[4832]: E1003 00:07:25.386590 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 00:07:25 crc kubenswrapper[4832]: I1003 00:07:25.418910 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:25 crc kubenswrapper[4832]: I1003 00:07:25.418949 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:25 crc kubenswrapper[4832]: I1003 00:07:25.418958 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:25 crc kubenswrapper[4832]: I1003 00:07:25.418972 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:25 crc kubenswrapper[4832]: I1003 00:07:25.418983 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:25Z","lastTransitionTime":"2025-10-03T00:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:25 crc kubenswrapper[4832]: I1003 00:07:25.521295 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:25 crc kubenswrapper[4832]: I1003 00:07:25.521576 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:25 crc kubenswrapper[4832]: I1003 00:07:25.521682 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:25 crc kubenswrapper[4832]: I1003 00:07:25.521798 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:25 crc kubenswrapper[4832]: I1003 00:07:25.521893 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:25Z","lastTransitionTime":"2025-10-03T00:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:25 crc kubenswrapper[4832]: I1003 00:07:25.623690 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:25 crc kubenswrapper[4832]: I1003 00:07:25.623857 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:25 crc kubenswrapper[4832]: I1003 00:07:25.623942 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:25 crc kubenswrapper[4832]: I1003 00:07:25.624057 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:25 crc kubenswrapper[4832]: I1003 00:07:25.624148 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:25Z","lastTransitionTime":"2025-10-03T00:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:25 crc kubenswrapper[4832]: I1003 00:07:25.638229 4832 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-ppthn"] Oct 03 00:07:25 crc kubenswrapper[4832]: I1003 00:07:25.642007 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ppthn" Oct 03 00:07:25 crc kubenswrapper[4832]: E1003 00:07:25.642161 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
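
Separately from the webhook failures, the node keeps being set NotReady with reason KubeletNotReady because the container runtime network is not ready: no CNI configuration file exists in /etc/kubernetes/cni/net.d/ yet, so several pods are skipped with "network is not ready". A quick check of that directory is sketched below; the path is quoted verbatim from the kubelet message, and Python 3 on the host is an assumption.

# Sketch: list whatever CNI configuration the kubelet would find in the
# directory named in the NetworkPluginNotReady message above.
from pathlib import Path

cni_dir = Path("/etc/kubernetes/cni/net.d")

if not cni_dir.is_dir():
    print(f"{cni_dir} does not exist")
else:
    # .conf, .conflist and .json are the extensions the CNI loader picks up.
    confs = sorted(p.name for p in cni_dir.iterdir()
                   if p.suffix in {".conf", ".conflist", ".json"})
    print(f"CNI configs in {cni_dir}: {confs or 'none'}")

Until a configuration file appears there, the "Node became not ready ... NetworkPluginNotReady" entries above will keep repeating on every node-status sync.
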
pod="openshift-multus/network-metrics-daemon-ppthn" podUID="f3e796ab-f354-47c2-8f1b-1d29fe9ab059" Oct 03 00:07:25 crc kubenswrapper[4832]: I1003 00:07:25.654051 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13eeb165-19f7-4e3f-a442-1c9c570d853b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://196d3468b2a4ea6792c19f85b28df1b5df5e612428d986d2affed70158ba92f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfnk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://786805325c320e5c08542c14a9ac9790f7c2dec237737fe1aca1523d1f2d26a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfnk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xzpk4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to 
call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:25Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:25 crc kubenswrapper[4832]: I1003 00:07:25.668772 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fad7ffe6-f721-4c38-8acd-2d4de59b527b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da0287ba03b221e49a8840f93c53df8a865b9216a4e93f4a0fc4c62d79227004\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edaa07dca4d8aba0a6439a43af7397b5d9ef4d5a398f5c03ac93f3f41fb80ed9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef77e4c71ea4d3dd7aa61afb0787bf88352730ae638cfefe1f9f9f033208692a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0636ac12ebbda7e4ee8bbe4f5be5454ca7a879c92c8f444466a34c387668398c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af8925d64d9d95117625cfca58e2f0c90128ce63c82370e3a505d48fce14e474\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"message\\\":\\\"file observer\\\\nW1003 00:07:06.003008 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 00:07:06.003232 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 00:07:06.007835 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-713505854/tls.crt::/tmp/serving-cert-713505854/tls.key\\\\\\\"\\\\nI1003 00:07:06.247728 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 00:07:06.253351 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 00:07:06.253377 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 00:07:06.253665 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 00:07:06.253677 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 00:07:06.261087 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 00:07:06.261596 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 00:07:06.261602 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 00:07:06.261606 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 00:07:06.261610 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 00:07:06.261613 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 00:07:06.261616 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 00:07:06.261905 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 00:07:06.264368 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc6fca5c4906594dc721c77dba62a306d840e88ed949a9a579a695bdc1f1c011\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://359b3f91c772c18a52ed1e5985275405b990187a860cf7a7f62c2f602f54c77e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://359b3f91c772c18a52ed1e5985275405b990187a860cf7a7f62c2f602f54c77e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:06:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:25Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:25 crc kubenswrapper[4832]: I1003 00:07:25.682134 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a530701-412d-4b02-b532-de7993881992\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2b0553c8f774a096bc1446325436a1a73a24f67b3512462ce0b5029092f30c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0451961c80d1affd1cc6903d8c4ccba7de47c25255826d49ee312958f7a73b79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09e57b2ff533ebe739ef6e23dcd8d6f5ef85143046db77ed3a95ffb87f771152\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://703d4dc13279deda56e8154ab7238b507e78852fd477db2bbe6e3e19aa0fbd37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:06:47Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:25Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:25 crc kubenswrapper[4832]: I1003 00:07:25.695015 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4j5bh" event={"ID":"7262a85e-7e44-4422-b1d2-5d045503e06a","Type":"ContainerStarted","Data":"f03e95e621cb9e7d0c04b426440cf746eeccf68892215e1ed9e963935b9fdba7"} Oct 03 00:07:25 crc kubenswrapper[4832]: I1003 00:07:25.695060 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4j5bh" event={"ID":"7262a85e-7e44-4422-b1d2-5d045503e06a","Type":"ContainerStarted","Data":"afd2b89ee54380c73b04bde347dce4d74a847e4ba94fc7aeabd5ac24e4dfd4b9"} Oct 03 00:07:25 crc kubenswrapper[4832]: I1003 00:07:25.695403 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:25Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:25 crc kubenswrapper[4832]: I1003 00:07:25.705324 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-76ck7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b962b7f-e442-4e25-97f4-21f16b9f1324\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e5df7a035e3c381c541f08eec6fffe846e7002a534b4c6ae1316a15cf74e469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ksm2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-76ck7\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:25Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:25 crc kubenswrapper[4832]: I1003 00:07:25.718491 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4j5bh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7262a85e-7e44-4422-b1d2-5d045503e06a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vwhr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vwhr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4j5bh\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:25Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:25 crc kubenswrapper[4832]: I1003 00:07:25.725659 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:25 crc kubenswrapper[4832]: I1003 00:07:25.725934 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:25 crc kubenswrapper[4832]: I1003 00:07:25.726065 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:25 crc kubenswrapper[4832]: I1003 00:07:25.726290 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:25 crc kubenswrapper[4832]: I1003 00:07:25.726463 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:25Z","lastTransitionTime":"2025-10-03T00:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:25 crc kubenswrapper[4832]: I1003 00:07:25.731684 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ppthn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3e796ab-f354-47c2-8f1b-1d29fe9ab059\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:25Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ppthn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:25Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:25 crc kubenswrapper[4832]: I1003 00:07:25.735103 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6t5s\" (UniqueName: \"kubernetes.io/projected/f3e796ab-f354-47c2-8f1b-1d29fe9ab059-kube-api-access-j6t5s\") pod \"network-metrics-daemon-ppthn\" (UID: \"f3e796ab-f354-47c2-8f1b-1d29fe9ab059\") " pod="openshift-multus/network-metrics-daemon-ppthn" Oct 03 00:07:25 crc kubenswrapper[4832]: I1003 00:07:25.735413 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f3e796ab-f354-47c2-8f1b-1d29fe9ab059-metrics-certs\") pod \"network-metrics-daemon-ppthn\" (UID: \"f3e796ab-f354-47c2-8f1b-1d29fe9ab059\") " pod="openshift-multus/network-metrics-daemon-ppthn" Oct 03 00:07:25 crc kubenswrapper[4832]: I1003 00:07:25.749845 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"31c3e0e9-c2a7-4c60-b523-9972df4adad5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1688f7425bdf5630147752e6d2c0dc58351b86d087c39bc6c4d73ffbf11949b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70f4161badca9a8087ab95b77ede5bc72d0c44b3f7f0223f2073354e76d6070d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5eb1fa15d84200378239e8a064cb83aa401750e92073f9d29d94858e65add776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://342b151ec083dc566428af857e4013b70d34fbf
70632147d13f5e251ef3460c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f39a6c9881b2f1d620a85a3c6887d82b81d18ecf599c66b967d5580689537314\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a180babeb6469b2158474f74047864c74117a5c272be9437712bc1729563108b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a180babeb6469b2158474f74047864c74117a5c272be9437712bc1729563108b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b12fcbe54af7073178320f8a130e78196b2d1d3a055c4d8410e4cc1da2d4686e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b12fcbe54af7073178320f8a130e78196b2d1d3a055c4d8410e4cc1da2d4686e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f5e7853d0c8e96873c687c75d7226d8c11ed0ab5f4f2e1e975423de06b7c8c10\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5e7853d0c8e96873c687c75d7226d8c11ed0ab5f4f2e1e975423de06b7c8c10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:06:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:25Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:25 crc kubenswrapper[4832]: I1003 00:07:25.762282 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:25Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:25 crc kubenswrapper[4832]: I1003 00:07:25.792168 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4175afbbdf7905f2c69f608e1cd85fdfae2818582d88e9a6d46c20ba0cede788\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:25Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:25 crc kubenswrapper[4832]: I1003 00:07:25.804264 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6753f7eb0f3ff7e4c94125513be77026242079d658b7139e86701a2b04ba5d1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://823db4c8596fa85759a3d818b0282b8b8b2c4c2e948160accbd306ef09e38060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:25Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:25 crc kubenswrapper[4832]: I1003 00:07:25.819038 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rwqt7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96bf1ee1-9249-4db9-83da-17eb15aa5f9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://778c9afa5747e424ff72688b9d90dd21e3faec1f409c2b9683b5a16a37685369\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm92t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rwqt7\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:25Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:25 crc kubenswrapper[4832]: I1003 00:07:25.829419 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:25 crc kubenswrapper[4832]: I1003 00:07:25.829520 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:25 crc kubenswrapper[4832]: I1003 00:07:25.829532 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:25 crc kubenswrapper[4832]: I1003 00:07:25.829557 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:25 crc kubenswrapper[4832]: I1003 00:07:25.829570 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:25Z","lastTransitionTime":"2025-10-03T00:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:25 crc kubenswrapper[4832]: I1003 00:07:25.834644 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:25Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:25 crc kubenswrapper[4832]: I1003 00:07:25.836246 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f3e796ab-f354-47c2-8f1b-1d29fe9ab059-metrics-certs\") pod \"network-metrics-daemon-ppthn\" (UID: \"f3e796ab-f354-47c2-8f1b-1d29fe9ab059\") " pod="openshift-multus/network-metrics-daemon-ppthn" Oct 03 00:07:25 crc kubenswrapper[4832]: I1003 00:07:25.836307 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j6t5s\" (UniqueName: \"kubernetes.io/projected/f3e796ab-f354-47c2-8f1b-1d29fe9ab059-kube-api-access-j6t5s\") pod \"network-metrics-daemon-ppthn\" (UID: \"f3e796ab-f354-47c2-8f1b-1d29fe9ab059\") " pod="openshift-multus/network-metrics-daemon-ppthn" Oct 03 00:07:25 crc kubenswrapper[4832]: E1003 00:07:25.836455 4832 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 00:07:25 crc kubenswrapper[4832]: E1003 00:07:25.836556 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f3e796ab-f354-47c2-8f1b-1d29fe9ab059-metrics-certs podName:f3e796ab-f354-47c2-8f1b-1d29fe9ab059 nodeName:}" failed. No retries permitted until 2025-10-03 00:07:26.336536564 +0000 UTC m=+39.598691670 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f3e796ab-f354-47c2-8f1b-1d29fe9ab059-metrics-certs") pod "network-metrics-daemon-ppthn" (UID: "f3e796ab-f354-47c2-8f1b-1d29fe9ab059") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 00:07:25 crc kubenswrapper[4832]: I1003 00:07:25.848810 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zdfsz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66a6f849-6870-4e28-8ca4-c849c22ad252\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53e3e718893a18a23dee11b28bb7285f26501527c6aa136acf2b8f7e0d7494c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f49f638397d6dffb41b5774d362b7441d3191f290149de6ecef5c38a6e8a7a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f49f638397d6dffb41b5774d362b7441d3191f290149de6ecef5c38a6e8a7a81\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d52a70ad1026d8eaaeed9dbb581eb90dbc962f036711e2f68085ac437c46b1fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d52a70ad1026d8eaaeed9dbb581eb90dbc962f036711e2f68085ac437c46b1fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ddd5a446bcc39d02bbaf8ae1b9a92b550b975c4280cea13a517076261e177bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ddd5a446bcc39d02bbaf8ae1b9a92b550b975c4280cea13a517076261e177bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f907c9873e2f8cbedebcc3116624dbaf2e6b8088d9f0ef52a5878e1515e73720\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f907c9873e2f8cbedebcc3116624dbaf2e6b8088d9f0ef52a5878e1515e73720
\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d991977ea5d7140ddc09fecfc872067a05a7773c0066c94ec88c603b6f79f32e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d991977ea5d7140ddc09fecfc872067a05a7773c0066c94ec88c603b6f79f32e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7711e37ec666f0285f64eac11397ea7747d74997deaba1e9cc141218d28fbaa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7711e37ec666f0285f64eac11397ea7747d74997deaba1e9cc141218d28fbaa7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zdfsz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:25Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:25 crc kubenswrapper[4832]: I1003 00:07:25.852688 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6t5s\" (UniqueName: \"kubernetes.io/projected/f3e796ab-f354-47c2-8f1b-1d29fe9ab059-kube-api-access-j6t5s\") pod \"network-metrics-daemon-ppthn\" (UID: \"f3e796ab-f354-47c2-8f1b-1d29fe9ab059\") " pod="openshift-multus/network-metrics-daemon-ppthn" Oct 03 00:07:25 crc kubenswrapper[4832]: I1003 00:07:25.865441 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78a6b04d90ea1fd4356f0a54594a83fd6029be216c6263654ad4916f2c3b7f5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:25Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:25 crc kubenswrapper[4832]: I1003 00:07:25.875026 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-b9c9g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4369beec-ad3f-4de9-909b-7d0f8c76aad3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://898c5e76b7f0bbad018b2aa9db1e27240ccbd9d5b66854f548aa6d664abc8799\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvpm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-b9c9g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:25Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:25 crc kubenswrapper[4832]: I1003 00:07:25.891081 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f189ec1-8169-4886-bff6-f67f7958050e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ba40dbb3c6b603f4c75b3f017b0b8953a189aac6dba4cc0e1fa5174b6353f3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2c63d70335d75dec7e19ae160dfa2a00f834d9aa3725bb4b63b94aa488fca0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3b296d00954bf341fee4ad258c6ee44adbd2950c89e73aaf5141201b61c5332\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7712a571bceb2c88adb90f9f18cdef6f7666e96341c1f9c87e6a55c82b59efd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6ee507d1127a1ca82e96f57a031cfef9adbd0b310588ce49a32f0cbcab7b3f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e493b0aab45e7a7842b48f0a13a6850b3bcd8a8f2147880ff278b92b04e9535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f68e28fe1f55d438c3d42880d9ed4a20367c2e4bd5e8636372586dfa489868e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f68e28fe1f55d438c3d42880d9ed4a20367c2e4bd5e8636372586dfa489868e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T00:07:22Z\\\",\\\"message\\\":\\\"s:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-controller-manager-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-controller-manager-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.58\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF1003 00:07:22.607875 6290 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed t\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:21Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-zc9xl_openshift-ovn-kubernetes(0f189ec1-8169-4886-bff6-f67f7958050e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae1e89f85d166170fd478f58e653b46eaa1c8335c1b124af9fde5b805864197b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0669e6b46f57beece4c83940bcffe580a4b4e2db5451a9c9ca4a9ad486fc95b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0669e6b46f57beece4c83940bcffe580a4b4e2db5451a9c9ca4a9ad486fc95b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zc9xl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:25Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:25 crc kubenswrapper[4832]: I1003 00:07:25.901916 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-76ck7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b962b7f-e442-4e25-97f4-21f16b9f1324\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e5df7a035e3c381c541f08eec6fffe846e7002a534b4c6ae1316a15cf74e469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ksm2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\
"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-76ck7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:25Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:25 crc kubenswrapper[4832]: I1003 00:07:25.913798 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4j5bh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7262a85e-7e44-4422-b1d2-5d045503e06a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afd2b89ee54380c73b04bde347dce4d74a847e4ba94fc7aeabd5ac24e4dfd4b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vwhr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f03e95e621cb9e7d0c04b426440cf746eeccf68892215e1ed9e963935b9fdba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\
\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vwhr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4j5bh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:25Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:25 crc kubenswrapper[4832]: I1003 00:07:25.925901 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ppthn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3e796ab-f354-47c2-8f1b-1d29fe9ab059\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:25Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ppthn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:25Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:25 crc kubenswrapper[4832]: I1003 00:07:25.931914 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:25 crc kubenswrapper[4832]: I1003 00:07:25.931948 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:25 crc kubenswrapper[4832]: I1003 00:07:25.931958 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:25 crc kubenswrapper[4832]: I1003 00:07:25.931978 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:25 crc kubenswrapper[4832]: I1003 00:07:25.931989 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:25Z","lastTransitionTime":"2025-10-03T00:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:25 crc kubenswrapper[4832]: I1003 00:07:25.944804 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31c3e0e9-c2a7-4c60-b523-9972df4adad5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1688f7425bdf5630147752e6d2c0dc58351b86d087c39bc6c4d73ffbf11949b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70f4161badca9a8087ab95b77ede5bc72d0c44b3f7f0223f2073354e76d6070d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5eb1fa15d84200378239e8a064cb83aa401750e92073f9d29d94858e65add776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://342b151ec083dc566428af857e4013b70d34fbf70632147d13f5e251ef3460c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f39a6c9881b2f1d620a85a3c6887d82b81d18ecf599c66b967d5580689537314\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a180babeb6469b2158474f74047864c74117a5c272be9437712bc1729563108b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a180babeb6469b2158474f74047864c74117a5c272be9437712bc1729563108b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b12fcbe54af7073178320f8a130e78196b2d1d3a055c4d8410e4cc1da2d4686e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b12fcbe54af7073178320f8a130e78196b2d1d3a055c4d8410e4cc1da2d4686e\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f5e7853d0c8e96873c687c75d7226d8c11ed0ab5f4f2e1e975423de06b7c8c10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5e7853d0c8e96873c687c75d7226d8c11ed0ab5f4f2e1e975423de06b7c8c10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:06:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:25Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:25 crc kubenswrapper[4832]: I1003 00:07:25.955806 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:25Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:25 crc kubenswrapper[4832]: I1003 00:07:25.966523 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4175afbbdf7905f2c69f608e1cd85fdfae2818582d88e9a6d46c20ba0cede788\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:25Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:25 crc kubenswrapper[4832]: I1003 00:07:25.978430 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6753f7eb0f3ff7e4c94125513be77026242079d658b7139e86701a2b04ba5d1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://823db4c8596fa85759a3d818b0282b8b8b2c4c2e948160accbd306ef09e38060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:25Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:26 crc kubenswrapper[4832]: I1003 00:07:26.002263 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rwqt7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96bf1ee1-9249-4db9-83da-17eb15aa5f9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://778c9afa5747e424ff72688b9d90dd21e3faec1f409c2b9683b5a16a37685369\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm92t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rwqt7\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:25Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:26 crc kubenswrapper[4832]: I1003 00:07:26.033763 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:26Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:26 crc kubenswrapper[4832]: I1003 00:07:26.035257 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:26 crc kubenswrapper[4832]: I1003 00:07:26.035290 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:26 crc kubenswrapper[4832]: I1003 00:07:26.035303 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:26 crc kubenswrapper[4832]: I1003 00:07:26.035320 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:26 crc kubenswrapper[4832]: I1003 00:07:26.035331 4832 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:26Z","lastTransitionTime":"2025-10-03T00:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:26 crc kubenswrapper[4832]: I1003 00:07:26.050744 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zdfsz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66a6f849-6870-4e28-8ca4-c849c22ad252\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53e3e718893a18a23dee11b28bb7285f26501527c6aa136acf2b8f7e0d7494c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f49f638397d6dffb41b5774d362b7441d3191f290149de6ecef5c38a6e8a7a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f49f638397d6dffb41b5774d362b7441d3191f290149de6ecef5c38a6e8a7a81\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",
\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d52a70ad1026d8eaaeed9dbb581eb90dbc962f036711e2f68085ac437c46b1fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d52a70ad1026d8eaaeed9dbb581eb90dbc962f036711e2f68085ac437c46b1fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ddd5a446bcc39d02bbaf8ae1b9a92b550b975c4280cea13a517076261e177bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ddd5a446bcc39d02bbaf8ae1b9a92b550b975c4280cea13a517076261e177bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f907c9873e2f8cbedebcc3116624dbaf2e6b8088d9f0ef52a5878e1515e73720\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"co
ntainerID\\\":\\\"cri-o://f907c9873e2f8cbedebcc3116624dbaf2e6b8088d9f0ef52a5878e1515e73720\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d991977ea5d7140ddc09fecfc872067a05a7773c0066c94ec88c603b6f79f32e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d991977ea5d7140ddc09fecfc872067a05a7773c0066c94ec88c603b6f79f32e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7711e37ec666f0285f64eac11397ea7747d74997deaba1e9cc141218d28fbaa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7711e37ec666f0285f64eac11397ea7747d74997deaba1e9cc141218d28fbaa7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zdfsz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to 
call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:26Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:26 crc kubenswrapper[4832]: I1003 00:07:26.060620 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78a6b04d90ea1fd4356f0a54594a83fd6029be216c6263654ad4916f2c3b7f5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:26Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:26 crc kubenswrapper[4832]: I1003 00:07:26.070694 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-b9c9g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4369beec-ad3f-4de9-909b-7d0f8c76aad3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://898c5e76b7f0bbad018b2aa9db1e27240ccbd9d5b66854f548aa6d664abc8799\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvpm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-b9c9g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:26Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:26 crc kubenswrapper[4832]: I1003 00:07:26.087564 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f189ec1-8169-4886-bff6-f67f7958050e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ba40dbb3c6b603f4c75b3f017b0b8953a189aac6dba4cc0e1fa5174b6353f3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2c63d70335d75dec7e19ae160dfa2a00f834d9aa3725bb4b63b94aa488fca0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3b296d00954bf341fee4ad258c6ee44adbd2950c89e73aaf5141201b61c5332\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7712a571bceb2c88adb90f9f18cdef6f7666e96341c1f9c87e6a55c82b59efd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6ee507d1127a1ca82e96f57a031cfef9adbd0b310588ce49a32f0cbcab7b3f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e493b0aab45e7a7842b48f0a13a6850b3bcd8a8f2147880ff278b92b04e9535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f68e28fe1f55d438c3d42880d9ed4a20367c2e4bd5e8636372586dfa489868e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f68e28fe1f55d438c3d42880d9ed4a20367c2e4bd5e8636372586dfa489868e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T00:07:22Z\\\",\\\"message\\\":\\\"s:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-controller-manager-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-controller-manager-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.58\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF1003 00:07:22.607875 6290 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed t\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:21Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-zc9xl_openshift-ovn-kubernetes(0f189ec1-8169-4886-bff6-f67f7958050e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae1e89f85d166170fd478f58e653b46eaa1c8335c1b124af9fde5b805864197b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0669e6b46f57beece4c83940bcffe580a4b4e2db5451a9c9ca4a9ad486fc95b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0669e6b46f57beece4c83940bcffe580a4b4e2db5451a9c9ca4a9ad486fc95b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zc9xl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:26Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:26 crc kubenswrapper[4832]: I1003 00:07:26.097930 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13eeb165-19f7-4e3f-a442-1c9c570d853b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://196d3468b2a4ea6792c19f85b28df1b5df5e612428d986d2affed70158ba92f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfnk8\\\",
\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://786805325c320e5c08542c14a9ac9790f7c2dec237737fe1aca1523d1f2d26a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfnk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xzpk4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:26Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:26 crc kubenswrapper[4832]: I1003 00:07:26.109334 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fad7ffe6-f721-4c38-8acd-2d4de59b527b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da0287ba03b221e49a8840f93c53df8a865b9216a4e93f4a0fc4c62d79227004\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edaa07dca4d8aba0a6439a43af7397b5d9ef4d5a398f5c03ac93f3f41fb80ed9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef77e4c71ea4d3dd7aa61afb0787bf88352730ae638cfefe1f9f9f033208692a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0636ac12ebbda7e4ee8bbe4f5be5454ca7a879c92c8f444466a34c387668398c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af8925d64d9d95117625cfca58e2f0c90128ce63c82370e3a505d48fce14e474\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"message\\\":\\\"file observer\\\\nW1003 00:07:06.003008 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 00:07:06.003232 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 00:07:06.007835 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-713505854/tls.crt::/tmp/serving-cert-713505854/tls.key\\\\\\\"\\\\nI1003 00:07:06.247728 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 00:07:06.253351 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 00:07:06.253377 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 00:07:06.253665 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 00:07:06.253677 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 00:07:06.261087 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 00:07:06.261596 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 00:07:06.261602 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 00:07:06.261606 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 00:07:06.261610 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 00:07:06.261613 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 00:07:06.261616 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 00:07:06.261905 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 00:07:06.264368 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc6fca5c4906594dc721c77dba62a306d840e88ed949a9a579a695bdc1f1c011\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://359b3f91c772c18a52ed1e5985275405b990187a860cf7a7f62c2f602f54c77e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://359b3f91c772c18a52ed1e5985275405b990187a860cf7a7f62c2f602f54c77e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:06:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:26Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:26 crc kubenswrapper[4832]: I1003 00:07:26.120004 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a530701-412d-4b02-b532-de7993881992\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2b0553c8f774a096bc1446325436a1a73a24f67b3512462ce0b5029092f30c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0451961c80d1affd1cc6903d8c4ccba7de47c25255826d49ee312958f7a73b79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09e57b2ff533ebe739ef6e23dcd8d6f5ef85143046db77ed3a95ffb87f771152\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://703d4dc13279deda56e8154ab7238b507e78852fd477db2bbe6e3e19aa0fbd37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:06:47Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:26Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:26 crc kubenswrapper[4832]: I1003 00:07:26.131159 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:26Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:26 crc kubenswrapper[4832]: I1003 00:07:26.138016 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:26 crc kubenswrapper[4832]: I1003 00:07:26.138053 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:26 crc kubenswrapper[4832]: I1003 00:07:26.138061 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:26 crc kubenswrapper[4832]: I1003 00:07:26.138077 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:26 crc kubenswrapper[4832]: I1003 00:07:26.138086 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:26Z","lastTransitionTime":"2025-10-03T00:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:26 crc kubenswrapper[4832]: I1003 00:07:26.240420 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:26 crc kubenswrapper[4832]: I1003 00:07:26.240461 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:26 crc kubenswrapper[4832]: I1003 00:07:26.240469 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:26 crc kubenswrapper[4832]: I1003 00:07:26.240485 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:26 crc kubenswrapper[4832]: I1003 00:07:26.240494 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:26Z","lastTransitionTime":"2025-10-03T00:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:26 crc kubenswrapper[4832]: I1003 00:07:26.340385 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f3e796ab-f354-47c2-8f1b-1d29fe9ab059-metrics-certs\") pod \"network-metrics-daemon-ppthn\" (UID: \"f3e796ab-f354-47c2-8f1b-1d29fe9ab059\") " pod="openshift-multus/network-metrics-daemon-ppthn" Oct 03 00:07:26 crc kubenswrapper[4832]: E1003 00:07:26.340559 4832 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 00:07:26 crc kubenswrapper[4832]: E1003 00:07:26.340624 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f3e796ab-f354-47c2-8f1b-1d29fe9ab059-metrics-certs podName:f3e796ab-f354-47c2-8f1b-1d29fe9ab059 nodeName:}" failed. No retries permitted until 2025-10-03 00:07:27.340607724 +0000 UTC m=+40.602762830 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f3e796ab-f354-47c2-8f1b-1d29fe9ab059-metrics-certs") pod "network-metrics-daemon-ppthn" (UID: "f3e796ab-f354-47c2-8f1b-1d29fe9ab059") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 00:07:26 crc kubenswrapper[4832]: I1003 00:07:26.342142 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:26 crc kubenswrapper[4832]: I1003 00:07:26.342177 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:26 crc kubenswrapper[4832]: I1003 00:07:26.342185 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:26 crc kubenswrapper[4832]: I1003 00:07:26.342198 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:26 crc kubenswrapper[4832]: I1003 00:07:26.342207 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:26Z","lastTransitionTime":"2025-10-03T00:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:26 crc kubenswrapper[4832]: I1003 00:07:26.444073 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:26 crc kubenswrapper[4832]: I1003 00:07:26.444114 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:26 crc kubenswrapper[4832]: I1003 00:07:26.444123 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:26 crc kubenswrapper[4832]: I1003 00:07:26.444135 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:26 crc kubenswrapper[4832]: I1003 00:07:26.444143 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:26Z","lastTransitionTime":"2025-10-03T00:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:26 crc kubenswrapper[4832]: I1003 00:07:26.546087 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:26 crc kubenswrapper[4832]: I1003 00:07:26.546121 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:26 crc kubenswrapper[4832]: I1003 00:07:26.546129 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:26 crc kubenswrapper[4832]: I1003 00:07:26.546141 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:26 crc kubenswrapper[4832]: I1003 00:07:26.546150 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:26Z","lastTransitionTime":"2025-10-03T00:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:26 crc kubenswrapper[4832]: I1003 00:07:26.648215 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:26 crc kubenswrapper[4832]: I1003 00:07:26.648247 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:26 crc kubenswrapper[4832]: I1003 00:07:26.648259 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:26 crc kubenswrapper[4832]: I1003 00:07:26.648282 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:26 crc kubenswrapper[4832]: I1003 00:07:26.648293 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:26Z","lastTransitionTime":"2025-10-03T00:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:26 crc kubenswrapper[4832]: I1003 00:07:26.749889 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:26 crc kubenswrapper[4832]: I1003 00:07:26.749921 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:26 crc kubenswrapper[4832]: I1003 00:07:26.749929 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:26 crc kubenswrapper[4832]: I1003 00:07:26.749942 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:26 crc kubenswrapper[4832]: I1003 00:07:26.749950 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:26Z","lastTransitionTime":"2025-10-03T00:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:26 crc kubenswrapper[4832]: I1003 00:07:26.851720 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:26 crc kubenswrapper[4832]: I1003 00:07:26.852033 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:26 crc kubenswrapper[4832]: I1003 00:07:26.852046 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:26 crc kubenswrapper[4832]: I1003 00:07:26.852061 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:26 crc kubenswrapper[4832]: I1003 00:07:26.852070 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:26Z","lastTransitionTime":"2025-10-03T00:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:26 crc kubenswrapper[4832]: I1003 00:07:26.958897 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:26 crc kubenswrapper[4832]: I1003 00:07:26.958945 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:26 crc kubenswrapper[4832]: I1003 00:07:26.958955 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:26 crc kubenswrapper[4832]: I1003 00:07:26.958970 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:26 crc kubenswrapper[4832]: I1003 00:07:26.958981 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:26Z","lastTransitionTime":"2025-10-03T00:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:27 crc kubenswrapper[4832]: I1003 00:07:27.061373 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:27 crc kubenswrapper[4832]: I1003 00:07:27.061411 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:27 crc kubenswrapper[4832]: I1003 00:07:27.061424 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:27 crc kubenswrapper[4832]: I1003 00:07:27.061439 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:27 crc kubenswrapper[4832]: I1003 00:07:27.061448 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:27Z","lastTransitionTime":"2025-10-03T00:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:27 crc kubenswrapper[4832]: I1003 00:07:27.163693 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:27 crc kubenswrapper[4832]: I1003 00:07:27.163776 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:27 crc kubenswrapper[4832]: I1003 00:07:27.163787 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:27 crc kubenswrapper[4832]: I1003 00:07:27.163802 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:27 crc kubenswrapper[4832]: I1003 00:07:27.163814 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:27Z","lastTransitionTime":"2025-10-03T00:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:27 crc kubenswrapper[4832]: I1003 00:07:27.266077 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:27 crc kubenswrapper[4832]: I1003 00:07:27.266116 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:27 crc kubenswrapper[4832]: I1003 00:07:27.266125 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:27 crc kubenswrapper[4832]: I1003 00:07:27.266139 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:27 crc kubenswrapper[4832]: I1003 00:07:27.266152 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:27Z","lastTransitionTime":"2025-10-03T00:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:27 crc kubenswrapper[4832]: I1003 00:07:27.351703 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f3e796ab-f354-47c2-8f1b-1d29fe9ab059-metrics-certs\") pod \"network-metrics-daemon-ppthn\" (UID: \"f3e796ab-f354-47c2-8f1b-1d29fe9ab059\") " pod="openshift-multus/network-metrics-daemon-ppthn" Oct 03 00:07:27 crc kubenswrapper[4832]: E1003 00:07:27.351870 4832 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 00:07:27 crc kubenswrapper[4832]: E1003 00:07:27.351965 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f3e796ab-f354-47c2-8f1b-1d29fe9ab059-metrics-certs podName:f3e796ab-f354-47c2-8f1b-1d29fe9ab059 nodeName:}" failed. No retries permitted until 2025-10-03 00:07:29.351945488 +0000 UTC m=+42.614100644 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f3e796ab-f354-47c2-8f1b-1d29fe9ab059-metrics-certs") pod "network-metrics-daemon-ppthn" (UID: "f3e796ab-f354-47c2-8f1b-1d29fe9ab059") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 00:07:27 crc kubenswrapper[4832]: I1003 00:07:27.368439 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:27 crc kubenswrapper[4832]: I1003 00:07:27.368477 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:27 crc kubenswrapper[4832]: I1003 00:07:27.368485 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:27 crc kubenswrapper[4832]: I1003 00:07:27.368501 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:27 crc kubenswrapper[4832]: I1003 00:07:27.368510 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:27Z","lastTransitionTime":"2025-10-03T00:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:27 crc kubenswrapper[4832]: I1003 00:07:27.385805 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 00:07:27 crc kubenswrapper[4832]: E1003 00:07:27.385912 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 00:07:27 crc kubenswrapper[4832]: I1003 00:07:27.385978 4832 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 00:07:27 crc kubenswrapper[4832]: E1003 00:07:27.386109 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 00:07:27 crc kubenswrapper[4832]: I1003 00:07:27.385805 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ppthn" Oct 03 00:07:27 crc kubenswrapper[4832]: E1003 00:07:27.386217 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ppthn" podUID="f3e796ab-f354-47c2-8f1b-1d29fe9ab059" Oct 03 00:07:27 crc kubenswrapper[4832]: I1003 00:07:27.385915 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 00:07:27 crc kubenswrapper[4832]: E1003 00:07:27.386325 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 00:07:27 crc kubenswrapper[4832]: I1003 00:07:27.399485 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6753f7eb0f3ff7e4c94125513be77026242079d658b7139e86701a2b04ba5d1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://823db4c8596fa85759a3d818b0282b8b8b2c4c2e948160accbd306ef09e38060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:27Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:27 crc kubenswrapper[4832]: I1003 00:07:27.410908 4832 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/multus-rwqt7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96bf1ee1-9249-4db9-83da-17eb15aa5f9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://778c9afa5747e424ff72688b9d90dd21e3faec1f409c2b9683b5a16a37685369\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm92t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rwqt7\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:27Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:27 crc kubenswrapper[4832]: I1003 00:07:27.420899 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-76ck7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b962b7f-e442-4e25-97f4-21f16b9f1324\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e5df7a035e3c381c541f08eec6fffe846e7002a534b4c6ae1316a15cf74e469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ksm2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-76ck7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:27Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:27 crc kubenswrapper[4832]: I1003 00:07:27.430584 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4j5bh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7262a85e-7e44-4422-b1d2-5d045503e06a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afd2b89ee54380c73b04bde347dce4d74a847e4ba94fc7aeabd5ac24e4dfd4b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vwhr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f03e95e621cb9e7d0c04b426440cf746eeccf68892215e1ed9e963935b9fdba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vwhr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4j5bh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:27Z is after 2025-08-24T17:21:41Z" Oct 03 
00:07:27 crc kubenswrapper[4832]: I1003 00:07:27.440815 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ppthn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3e796ab-f354-47c2-8f1b-1d29fe9ab059\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:25Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ppthn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:27Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:27 crc kubenswrapper[4832]: I1003 00:07:27.469223 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"31c3e0e9-c2a7-4c60-b523-9972df4adad5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1688f7425bdf5630147752e6d2c0dc58351b86d087c39bc6c4d73ffbf11949b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70f4161badca9a8087ab95b77ede5bc72d0c44b3f7f0223f2073354e76d6070d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5eb1fa15d84200378239e8a064cb83aa401750e92073f9d29d94858e65add776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://342b151ec083dc566428af857e4013b70d34fbf
70632147d13f5e251ef3460c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f39a6c9881b2f1d620a85a3c6887d82b81d18ecf599c66b967d5580689537314\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a180babeb6469b2158474f74047864c74117a5c272be9437712bc1729563108b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a180babeb6469b2158474f74047864c74117a5c272be9437712bc1729563108b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b12fcbe54af7073178320f8a130e78196b2d1d3a055c4d8410e4cc1da2d4686e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b12fcbe54af7073178320f8a130e78196b2d1d3a055c4d8410e4cc1da2d4686e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f5e7853d0c8e96873c687c75d7226d8c11ed0ab5f4f2e1e975423de06b7c8c10\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5e7853d0c8e96873c687c75d7226d8c11ed0ab5f4f2e1e975423de06b7c8c10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:06:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:27Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:27 crc kubenswrapper[4832]: I1003 00:07:27.470466 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:27 crc kubenswrapper[4832]: I1003 00:07:27.470487 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:27 crc kubenswrapper[4832]: I1003 00:07:27.470494 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:27 crc kubenswrapper[4832]: I1003 00:07:27.470509 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:27 crc kubenswrapper[4832]: I1003 00:07:27.470521 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:27Z","lastTransitionTime":"2025-10-03T00:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:27 crc kubenswrapper[4832]: I1003 00:07:27.479584 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:27Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:27 crc kubenswrapper[4832]: I1003 00:07:27.490011 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4175afbbdf7905f2c69f608e1cd85fdfae2818582d88e9a6d46c20ba0cede788\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:27Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:27 crc kubenswrapper[4832]: I1003 00:07:27.499778 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:27Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:27 crc kubenswrapper[4832]: I1003 00:07:27.510908 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zdfsz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66a6f849-6870-4e28-8ca4-c849c22ad252\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53e3e718893a18a23dee11b28bb7285f26501527c6aa136acf2b8f7e0d7494c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f49f638397d6dffb41b5774d362b7441d3191f290149de6ecef5c38a6e8a7a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f49f638397d6dffb41b5774d362b7441d3191f290149de6ecef5c38a6e8a7a81\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d52a70ad1026d8eaaeed9dbb581eb90dbc962f036711e2f68085ac437c46b1fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d52a70ad1026d8eaaeed9dbb581eb90dbc962f036711e2f68085ac437c46b1fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ddd5a446bcc39d02bbaf8ae1b9a92b550b975c4280cea13a517076261e177bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ddd5a446bcc39d02bbaf8ae1b9a92b550b975c4280cea13a517076261e177bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f907c9873e2f8cbedebcc3116624dbaf2e6b8088d9f0ef52a5878e1515e73720\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f907c9873e2f8cbedebcc3116624dbaf2e6b8088d9f0ef52a5878e1515e73720\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d991977ea5d7140ddc09fecfc872067a05a7773c0066c94ec88c603b6f79f32e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d991977ea5d7140ddc09fecfc872067a05a7773c0066c94ec88c603b6f79f32e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7711e37ec666f0285f64eac11397ea7747d74997deaba1e9cc141218d28fbaa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7711e37ec666f0285f64eac11397ea7747d74997deaba1e9cc141218d28fbaa7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/c
ni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zdfsz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:27Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:27 crc kubenswrapper[4832]: I1003 00:07:27.520374 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78a6b04d90ea1fd4356f0a54594a83fd6029be216c6263654ad4916f2c3b7f5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:27Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:27 crc kubenswrapper[4832]: I1003 00:07:27.528244 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-b9c9g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4369beec-ad3f-4de9-909b-7d0f8c76aad3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://898c5e76b7f0bbad018b2aa9db1e27240ccbd9d5b66854f548aa6d664abc8799\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvpm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-b9c9g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:27Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:27 crc kubenswrapper[4832]: I1003 00:07:27.546812 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f189ec1-8169-4886-bff6-f67f7958050e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ba40dbb3c6b603f4c75b3f017b0b8953a189aac6dba4cc0e1fa5174b6353f3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2c63d70335d75dec7e19ae160dfa2a00f834d9aa3725bb4b63b94aa488fca0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3b296d00954bf341fee4ad258c6ee44adbd2950c89e73aaf5141201b61c5332\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7712a571bceb2c88adb90f9f18cdef6f7666e96341c1f9c87e6a55c82b59efd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6ee507d1127a1ca82e96f57a031cfef9adbd0b310588ce49a32f0cbcab7b3f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e493b0aab45e7a7842b48f0a13a6850b3bcd8a8f2147880ff278b92b04e9535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f68e28fe1f55d438c3d42880d9ed4a20367c2e4bd5e8636372586dfa489868e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f68e28fe1f55d438c3d42880d9ed4a20367c2e4bd5e8636372586dfa489868e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T00:07:22Z\\\",\\\"message\\\":\\\"s:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-controller-manager-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-controller-manager-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.58\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF1003 00:07:22.607875 6290 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed t\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:21Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-zc9xl_openshift-ovn-kubernetes(0f189ec1-8169-4886-bff6-f67f7958050e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae1e89f85d166170fd478f58e653b46eaa1c8335c1b124af9fde5b805864197b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0669e6b46f57beece4c83940bcffe580a4b4e2db5451a9c9ca4a9ad486fc95b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0669e6b46f57beece4c83940bcffe580a4b4e2db5451a9c9ca4a9ad486fc95b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zc9xl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:27Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:27 crc kubenswrapper[4832]: I1003 00:07:27.558204 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:27Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:27 crc kubenswrapper[4832]: I1003 00:07:27.572328 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13eeb165-19f7-4e3f-a442-1c9c570d853b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://196d3468b2a4ea6792c19f85b28df1b5df5e612428d986d2affed70158ba92f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfnk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://786805325c320e5c08542c14a9ac9790f7c2dec237737fe1aca1523d1f2d26a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfnk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xzpk4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:27Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:27 crc kubenswrapper[4832]: I1003 00:07:27.572461 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:27 crc kubenswrapper[4832]: I1003 00:07:27.572528 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:27 crc kubenswrapper[4832]: I1003 00:07:27.572545 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:27 crc kubenswrapper[4832]: I1003 00:07:27.572578 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:27 crc kubenswrapper[4832]: I1003 00:07:27.572597 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:27Z","lastTransitionTime":"2025-10-03T00:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:27 crc kubenswrapper[4832]: I1003 00:07:27.589530 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fad7ffe6-f721-4c38-8acd-2d4de59b527b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da0287ba03b221e49a8840f93c53df8a865b9216a4e93f4a0fc4c62d79227004\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edaa07dca4d8aba0a6439a43af7397b5d9ef4d5a398f5c03ac93f3f41fb80ed9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef77e4c71ea4d3dd7aa61afb0787bf88352730ae638cfefe1f9f9f033208692a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0636ac12ebbda7e4ee8bbe4f5be5454ca7a879c92c8f444466a34c387668398c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af8925d64d9d95117625cfca58e2f0c90128ce63c82370e3a505d48fce14e474\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"message\\\":\\\"file observer\\\\nW1003 00:07:06.003008 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 00:07:06.003232 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 00:07:06.007835 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-713505854/tls.crt::/tmp/serving-cert-713505854/tls.key\\\\\\\"\\\\nI1003 00:07:06.247728 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 00:07:06.253351 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 00:07:06.253377 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 00:07:06.253665 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 00:07:06.253677 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 00:07:06.261087 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 00:07:06.261596 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 00:07:06.261602 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 00:07:06.261606 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 00:07:06.261610 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 00:07:06.261613 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 00:07:06.261616 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 00:07:06.261905 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 00:07:06.264368 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc6fca5c4906594dc721c77dba62a306d840e88ed949a9a579a695bdc1f1c011\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://359b3f91c772c18a52ed1e5985275405b990187a860cf7a7f62c2f602f54c77e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://359b3f91c772c18a52ed1e5985275405b990187a860cf7a7f62c2f602f54c77e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:06:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:27Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:27 crc kubenswrapper[4832]: I1003 00:07:27.600717 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a530701-412d-4b02-b532-de7993881992\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2b0553c8f774a096bc1446325436a1a73a24f67b3512462ce0b5029092f30c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0451961c80d1affd1cc6903d8c4ccba7de47c25255826d49ee312958f7a73b79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09e57b2ff533ebe739ef6e23dcd8d6f5ef85143046db77ed3a95ffb87f771152\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://703d4dc13279deda56e8154ab7238b507e78852fd477db2bbe6e3e19aa0fbd37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:06:47Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:27Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:27 crc kubenswrapper[4832]: I1003 00:07:27.674660 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:27 crc kubenswrapper[4832]: I1003 00:07:27.674705 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:27 crc kubenswrapper[4832]: I1003 00:07:27.674716 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:27 crc kubenswrapper[4832]: I1003 00:07:27.674734 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:27 crc kubenswrapper[4832]: I1003 00:07:27.674746 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:27Z","lastTransitionTime":"2025-10-03T00:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:27 crc kubenswrapper[4832]: I1003 00:07:27.777079 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:27 crc kubenswrapper[4832]: I1003 00:07:27.777120 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:27 crc kubenswrapper[4832]: I1003 00:07:27.777132 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:27 crc kubenswrapper[4832]: I1003 00:07:27.777147 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:27 crc kubenswrapper[4832]: I1003 00:07:27.777156 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:27Z","lastTransitionTime":"2025-10-03T00:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:27 crc kubenswrapper[4832]: I1003 00:07:27.878875 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:27 crc kubenswrapper[4832]: I1003 00:07:27.878909 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:27 crc kubenswrapper[4832]: I1003 00:07:27.878921 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:27 crc kubenswrapper[4832]: I1003 00:07:27.878938 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:27 crc kubenswrapper[4832]: I1003 00:07:27.878949 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:27Z","lastTransitionTime":"2025-10-03T00:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:27 crc kubenswrapper[4832]: I1003 00:07:27.981164 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:27 crc kubenswrapper[4832]: I1003 00:07:27.981206 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:27 crc kubenswrapper[4832]: I1003 00:07:27.981216 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:27 crc kubenswrapper[4832]: I1003 00:07:27.981230 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:27 crc kubenswrapper[4832]: I1003 00:07:27.981240 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:27Z","lastTransitionTime":"2025-10-03T00:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:28 crc kubenswrapper[4832]: I1003 00:07:28.083567 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:28 crc kubenswrapper[4832]: I1003 00:07:28.083604 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:28 crc kubenswrapper[4832]: I1003 00:07:28.083613 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:28 crc kubenswrapper[4832]: I1003 00:07:28.083627 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:28 crc kubenswrapper[4832]: I1003 00:07:28.083637 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:28Z","lastTransitionTime":"2025-10-03T00:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:28 crc kubenswrapper[4832]: I1003 00:07:28.186221 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:28 crc kubenswrapper[4832]: I1003 00:07:28.186259 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:28 crc kubenswrapper[4832]: I1003 00:07:28.186270 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:28 crc kubenswrapper[4832]: I1003 00:07:28.186283 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:28 crc kubenswrapper[4832]: I1003 00:07:28.186294 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:28Z","lastTransitionTime":"2025-10-03T00:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:28 crc kubenswrapper[4832]: I1003 00:07:28.288029 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:28 crc kubenswrapper[4832]: I1003 00:07:28.288069 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:28 crc kubenswrapper[4832]: I1003 00:07:28.288079 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:28 crc kubenswrapper[4832]: I1003 00:07:28.288095 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:28 crc kubenswrapper[4832]: I1003 00:07:28.288104 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:28Z","lastTransitionTime":"2025-10-03T00:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:28 crc kubenswrapper[4832]: I1003 00:07:28.390674 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:28 crc kubenswrapper[4832]: I1003 00:07:28.390941 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:28 crc kubenswrapper[4832]: I1003 00:07:28.391027 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:28 crc kubenswrapper[4832]: I1003 00:07:28.391113 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:28 crc kubenswrapper[4832]: I1003 00:07:28.391225 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:28Z","lastTransitionTime":"2025-10-03T00:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:28 crc kubenswrapper[4832]: I1003 00:07:28.493412 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:28 crc kubenswrapper[4832]: I1003 00:07:28.493888 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:28 crc kubenswrapper[4832]: I1003 00:07:28.494088 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:28 crc kubenswrapper[4832]: I1003 00:07:28.494282 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:28 crc kubenswrapper[4832]: I1003 00:07:28.494503 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:28Z","lastTransitionTime":"2025-10-03T00:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:28 crc kubenswrapper[4832]: I1003 00:07:28.596348 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:28 crc kubenswrapper[4832]: I1003 00:07:28.596382 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:28 crc kubenswrapper[4832]: I1003 00:07:28.596395 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:28 crc kubenswrapper[4832]: I1003 00:07:28.596411 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:28 crc kubenswrapper[4832]: I1003 00:07:28.596422 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:28Z","lastTransitionTime":"2025-10-03T00:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:28 crc kubenswrapper[4832]: I1003 00:07:28.698567 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:28 crc kubenswrapper[4832]: I1003 00:07:28.698613 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:28 crc kubenswrapper[4832]: I1003 00:07:28.698629 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:28 crc kubenswrapper[4832]: I1003 00:07:28.698651 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:28 crc kubenswrapper[4832]: I1003 00:07:28.698665 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:28Z","lastTransitionTime":"2025-10-03T00:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:28 crc kubenswrapper[4832]: I1003 00:07:28.800469 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:28 crc kubenswrapper[4832]: I1003 00:07:28.800523 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:28 crc kubenswrapper[4832]: I1003 00:07:28.800539 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:28 crc kubenswrapper[4832]: I1003 00:07:28.800560 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:28 crc kubenswrapper[4832]: I1003 00:07:28.800575 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:28Z","lastTransitionTime":"2025-10-03T00:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:28 crc kubenswrapper[4832]: I1003 00:07:28.903114 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:28 crc kubenswrapper[4832]: I1003 00:07:28.903172 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:28 crc kubenswrapper[4832]: I1003 00:07:28.903186 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:28 crc kubenswrapper[4832]: I1003 00:07:28.903205 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:28 crc kubenswrapper[4832]: I1003 00:07:28.903218 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:28Z","lastTransitionTime":"2025-10-03T00:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:29 crc kubenswrapper[4832]: I1003 00:07:29.006091 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:29 crc kubenswrapper[4832]: I1003 00:07:29.006144 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:29 crc kubenswrapper[4832]: I1003 00:07:29.006160 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:29 crc kubenswrapper[4832]: I1003 00:07:29.006181 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:29 crc kubenswrapper[4832]: I1003 00:07:29.006194 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:29Z","lastTransitionTime":"2025-10-03T00:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:29 crc kubenswrapper[4832]: I1003 00:07:29.108247 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:29 crc kubenswrapper[4832]: I1003 00:07:29.108281 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:29 crc kubenswrapper[4832]: I1003 00:07:29.108289 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:29 crc kubenswrapper[4832]: I1003 00:07:29.108301 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:29 crc kubenswrapper[4832]: I1003 00:07:29.108311 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:29Z","lastTransitionTime":"2025-10-03T00:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:29 crc kubenswrapper[4832]: I1003 00:07:29.211120 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:29 crc kubenswrapper[4832]: I1003 00:07:29.211167 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:29 crc kubenswrapper[4832]: I1003 00:07:29.211180 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:29 crc kubenswrapper[4832]: I1003 00:07:29.211196 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:29 crc kubenswrapper[4832]: I1003 00:07:29.211206 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:29Z","lastTransitionTime":"2025-10-03T00:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:29 crc kubenswrapper[4832]: I1003 00:07:29.317812 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:29 crc kubenswrapper[4832]: I1003 00:07:29.317851 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:29 crc kubenswrapper[4832]: I1003 00:07:29.317864 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:29 crc kubenswrapper[4832]: I1003 00:07:29.317883 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:29 crc kubenswrapper[4832]: I1003 00:07:29.317911 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:29Z","lastTransitionTime":"2025-10-03T00:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:29 crc kubenswrapper[4832]: I1003 00:07:29.370706 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f3e796ab-f354-47c2-8f1b-1d29fe9ab059-metrics-certs\") pod \"network-metrics-daemon-ppthn\" (UID: \"f3e796ab-f354-47c2-8f1b-1d29fe9ab059\") " pod="openshift-multus/network-metrics-daemon-ppthn" Oct 03 00:07:29 crc kubenswrapper[4832]: E1003 00:07:29.370929 4832 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 00:07:29 crc kubenswrapper[4832]: E1003 00:07:29.371148 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f3e796ab-f354-47c2-8f1b-1d29fe9ab059-metrics-certs podName:f3e796ab-f354-47c2-8f1b-1d29fe9ab059 nodeName:}" failed. No retries permitted until 2025-10-03 00:07:33.371129141 +0000 UTC m=+46.633284247 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f3e796ab-f354-47c2-8f1b-1d29fe9ab059-metrics-certs") pod "network-metrics-daemon-ppthn" (UID: "f3e796ab-f354-47c2-8f1b-1d29fe9ab059") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 00:07:29 crc kubenswrapper[4832]: I1003 00:07:29.386248 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ppthn" Oct 03 00:07:29 crc kubenswrapper[4832]: E1003 00:07:29.386361 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ppthn" podUID="f3e796ab-f354-47c2-8f1b-1d29fe9ab059" Oct 03 00:07:29 crc kubenswrapper[4832]: I1003 00:07:29.386547 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 00:07:29 crc kubenswrapper[4832]: I1003 00:07:29.386658 4832 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 00:07:29 crc kubenswrapper[4832]: E1003 00:07:29.386715 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 00:07:29 crc kubenswrapper[4832]: E1003 00:07:29.386778 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 00:07:29 crc kubenswrapper[4832]: I1003 00:07:29.386877 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 00:07:29 crc kubenswrapper[4832]: E1003 00:07:29.387026 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 00:07:29 crc kubenswrapper[4832]: I1003 00:07:29.420329 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:29 crc kubenswrapper[4832]: I1003 00:07:29.420380 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:29 crc kubenswrapper[4832]: I1003 00:07:29.420389 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:29 crc kubenswrapper[4832]: I1003 00:07:29.420406 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:29 crc kubenswrapper[4832]: I1003 00:07:29.420417 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:29Z","lastTransitionTime":"2025-10-03T00:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:29 crc kubenswrapper[4832]: I1003 00:07:29.523184 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:29 crc kubenswrapper[4832]: I1003 00:07:29.523222 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:29 crc kubenswrapper[4832]: I1003 00:07:29.523232 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:29 crc kubenswrapper[4832]: I1003 00:07:29.523247 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:29 crc kubenswrapper[4832]: I1003 00:07:29.523256 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:29Z","lastTransitionTime":"2025-10-03T00:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:29 crc kubenswrapper[4832]: I1003 00:07:29.625786 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:29 crc kubenswrapper[4832]: I1003 00:07:29.625842 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:29 crc kubenswrapper[4832]: I1003 00:07:29.625881 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:29 crc kubenswrapper[4832]: I1003 00:07:29.625905 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:29 crc kubenswrapper[4832]: I1003 00:07:29.625919 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:29Z","lastTransitionTime":"2025-10-03T00:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:29 crc kubenswrapper[4832]: I1003 00:07:29.727971 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:29 crc kubenswrapper[4832]: I1003 00:07:29.728208 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:29 crc kubenswrapper[4832]: I1003 00:07:29.728320 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:29 crc kubenswrapper[4832]: I1003 00:07:29.728448 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:29 crc kubenswrapper[4832]: I1003 00:07:29.728559 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:29Z","lastTransitionTime":"2025-10-03T00:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:29 crc kubenswrapper[4832]: I1003 00:07:29.830618 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:29 crc kubenswrapper[4832]: I1003 00:07:29.830662 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:29 crc kubenswrapper[4832]: I1003 00:07:29.830678 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:29 crc kubenswrapper[4832]: I1003 00:07:29.830699 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:29 crc kubenswrapper[4832]: I1003 00:07:29.830713 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:29Z","lastTransitionTime":"2025-10-03T00:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:29 crc kubenswrapper[4832]: I1003 00:07:29.933402 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:29 crc kubenswrapper[4832]: I1003 00:07:29.933457 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:29 crc kubenswrapper[4832]: I1003 00:07:29.933473 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:29 crc kubenswrapper[4832]: I1003 00:07:29.933492 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:29 crc kubenswrapper[4832]: I1003 00:07:29.933506 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:29Z","lastTransitionTime":"2025-10-03T00:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:30 crc kubenswrapper[4832]: I1003 00:07:30.035785 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:30 crc kubenswrapper[4832]: I1003 00:07:30.035817 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:30 crc kubenswrapper[4832]: I1003 00:07:30.035826 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:30 crc kubenswrapper[4832]: I1003 00:07:30.035839 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:30 crc kubenswrapper[4832]: I1003 00:07:30.035847 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:30Z","lastTransitionTime":"2025-10-03T00:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:30 crc kubenswrapper[4832]: I1003 00:07:30.138414 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:30 crc kubenswrapper[4832]: I1003 00:07:30.138450 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:30 crc kubenswrapper[4832]: I1003 00:07:30.138458 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:30 crc kubenswrapper[4832]: I1003 00:07:30.138471 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:30 crc kubenswrapper[4832]: I1003 00:07:30.138480 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:30Z","lastTransitionTime":"2025-10-03T00:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:30 crc kubenswrapper[4832]: I1003 00:07:30.240971 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:30 crc kubenswrapper[4832]: I1003 00:07:30.241014 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:30 crc kubenswrapper[4832]: I1003 00:07:30.241031 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:30 crc kubenswrapper[4832]: I1003 00:07:30.241052 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:30 crc kubenswrapper[4832]: I1003 00:07:30.241068 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:30Z","lastTransitionTime":"2025-10-03T00:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:30 crc kubenswrapper[4832]: I1003 00:07:30.343636 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:30 crc kubenswrapper[4832]: I1003 00:07:30.343682 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:30 crc kubenswrapper[4832]: I1003 00:07:30.343694 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:30 crc kubenswrapper[4832]: I1003 00:07:30.343710 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:30 crc kubenswrapper[4832]: I1003 00:07:30.343721 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:30Z","lastTransitionTime":"2025-10-03T00:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:30 crc kubenswrapper[4832]: I1003 00:07:30.446900 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:30 crc kubenswrapper[4832]: I1003 00:07:30.446974 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:30 crc kubenswrapper[4832]: I1003 00:07:30.446986 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:30 crc kubenswrapper[4832]: I1003 00:07:30.447034 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:30 crc kubenswrapper[4832]: I1003 00:07:30.447048 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:30Z","lastTransitionTime":"2025-10-03T00:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:30 crc kubenswrapper[4832]: I1003 00:07:30.550740 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:30 crc kubenswrapper[4832]: I1003 00:07:30.550840 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:30 crc kubenswrapper[4832]: I1003 00:07:30.550852 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:30 crc kubenswrapper[4832]: I1003 00:07:30.550883 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:30 crc kubenswrapper[4832]: I1003 00:07:30.550894 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:30Z","lastTransitionTime":"2025-10-03T00:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:30 crc kubenswrapper[4832]: I1003 00:07:30.653273 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:30 crc kubenswrapper[4832]: I1003 00:07:30.653312 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:30 crc kubenswrapper[4832]: I1003 00:07:30.653322 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:30 crc kubenswrapper[4832]: I1003 00:07:30.653337 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:30 crc kubenswrapper[4832]: I1003 00:07:30.653348 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:30Z","lastTransitionTime":"2025-10-03T00:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:30 crc kubenswrapper[4832]: I1003 00:07:30.755697 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:30 crc kubenswrapper[4832]: I1003 00:07:30.755731 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:30 crc kubenswrapper[4832]: I1003 00:07:30.755742 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:30 crc kubenswrapper[4832]: I1003 00:07:30.755790 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:30 crc kubenswrapper[4832]: I1003 00:07:30.755802 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:30Z","lastTransitionTime":"2025-10-03T00:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:30 crc kubenswrapper[4832]: I1003 00:07:30.858365 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:30 crc kubenswrapper[4832]: I1003 00:07:30.858394 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:30 crc kubenswrapper[4832]: I1003 00:07:30.858401 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:30 crc kubenswrapper[4832]: I1003 00:07:30.858413 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:30 crc kubenswrapper[4832]: I1003 00:07:30.858421 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:30Z","lastTransitionTime":"2025-10-03T00:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:30 crc kubenswrapper[4832]: I1003 00:07:30.960398 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:30 crc kubenswrapper[4832]: I1003 00:07:30.960448 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:30 crc kubenswrapper[4832]: I1003 00:07:30.960480 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:30 crc kubenswrapper[4832]: I1003 00:07:30.960498 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:30 crc kubenswrapper[4832]: I1003 00:07:30.960510 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:30Z","lastTransitionTime":"2025-10-03T00:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:31 crc kubenswrapper[4832]: I1003 00:07:31.062203 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:31 crc kubenswrapper[4832]: I1003 00:07:31.062268 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:31 crc kubenswrapper[4832]: I1003 00:07:31.062287 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:31 crc kubenswrapper[4832]: I1003 00:07:31.062312 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:31 crc kubenswrapper[4832]: I1003 00:07:31.062330 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:31Z","lastTransitionTime":"2025-10-03T00:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:31 crc kubenswrapper[4832]: I1003 00:07:31.164691 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:31 crc kubenswrapper[4832]: I1003 00:07:31.164787 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:31 crc kubenswrapper[4832]: I1003 00:07:31.164805 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:31 crc kubenswrapper[4832]: I1003 00:07:31.164829 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:31 crc kubenswrapper[4832]: I1003 00:07:31.164846 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:31Z","lastTransitionTime":"2025-10-03T00:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:31 crc kubenswrapper[4832]: I1003 00:07:31.268437 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:31 crc kubenswrapper[4832]: I1003 00:07:31.268549 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:31 crc kubenswrapper[4832]: I1003 00:07:31.268566 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:31 crc kubenswrapper[4832]: I1003 00:07:31.268591 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:31 crc kubenswrapper[4832]: I1003 00:07:31.268612 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:31Z","lastTransitionTime":"2025-10-03T00:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:31 crc kubenswrapper[4832]: I1003 00:07:31.370994 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:31 crc kubenswrapper[4832]: I1003 00:07:31.371030 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:31 crc kubenswrapper[4832]: I1003 00:07:31.371045 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:31 crc kubenswrapper[4832]: I1003 00:07:31.371061 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:31 crc kubenswrapper[4832]: I1003 00:07:31.371075 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:31Z","lastTransitionTime":"2025-10-03T00:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:31 crc kubenswrapper[4832]: I1003 00:07:31.386497 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 00:07:31 crc kubenswrapper[4832]: I1003 00:07:31.386547 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 00:07:31 crc kubenswrapper[4832]: I1003 00:07:31.386592 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 00:07:31 crc kubenswrapper[4832]: E1003 00:07:31.386631 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 00:07:31 crc kubenswrapper[4832]: I1003 00:07:31.386494 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ppthn" Oct 03 00:07:31 crc kubenswrapper[4832]: E1003 00:07:31.386819 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 00:07:31 crc kubenswrapper[4832]: E1003 00:07:31.386944 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ppthn" podUID="f3e796ab-f354-47c2-8f1b-1d29fe9ab059" Oct 03 00:07:31 crc kubenswrapper[4832]: E1003 00:07:31.387067 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 00:07:31 crc kubenswrapper[4832]: I1003 00:07:31.474728 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:31 crc kubenswrapper[4832]: I1003 00:07:31.474798 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:31 crc kubenswrapper[4832]: I1003 00:07:31.474810 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:31 crc kubenswrapper[4832]: I1003 00:07:31.474826 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:31 crc kubenswrapper[4832]: I1003 00:07:31.474843 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:31Z","lastTransitionTime":"2025-10-03T00:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:31 crc kubenswrapper[4832]: I1003 00:07:31.579014 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:31 crc kubenswrapper[4832]: I1003 00:07:31.579092 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:31 crc kubenswrapper[4832]: I1003 00:07:31.579112 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:31 crc kubenswrapper[4832]: I1003 00:07:31.579141 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:31 crc kubenswrapper[4832]: I1003 00:07:31.579163 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:31Z","lastTransitionTime":"2025-10-03T00:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:31 crc kubenswrapper[4832]: I1003 00:07:31.681879 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:31 crc kubenswrapper[4832]: I1003 00:07:31.682173 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:31 crc kubenswrapper[4832]: I1003 00:07:31.682309 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:31 crc kubenswrapper[4832]: I1003 00:07:31.682496 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:31 crc kubenswrapper[4832]: I1003 00:07:31.682662 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:31Z","lastTransitionTime":"2025-10-03T00:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:31 crc kubenswrapper[4832]: I1003 00:07:31.785808 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:31 crc kubenswrapper[4832]: I1003 00:07:31.785852 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:31 crc kubenswrapper[4832]: I1003 00:07:31.785866 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:31 crc kubenswrapper[4832]: I1003 00:07:31.785897 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:31 crc kubenswrapper[4832]: I1003 00:07:31.785917 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:31Z","lastTransitionTime":"2025-10-03T00:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:31 crc kubenswrapper[4832]: I1003 00:07:31.887841 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:31 crc kubenswrapper[4832]: I1003 00:07:31.887889 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:31 crc kubenswrapper[4832]: I1003 00:07:31.887903 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:31 crc kubenswrapper[4832]: I1003 00:07:31.887921 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:31 crc kubenswrapper[4832]: I1003 00:07:31.887934 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:31Z","lastTransitionTime":"2025-10-03T00:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:31 crc kubenswrapper[4832]: I1003 00:07:31.989950 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:31 crc kubenswrapper[4832]: I1003 00:07:31.989984 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:31 crc kubenswrapper[4832]: I1003 00:07:31.989992 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:31 crc kubenswrapper[4832]: I1003 00:07:31.990007 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:31 crc kubenswrapper[4832]: I1003 00:07:31.990016 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:31Z","lastTransitionTime":"2025-10-03T00:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:32 crc kubenswrapper[4832]: I1003 00:07:32.092501 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:32 crc kubenswrapper[4832]: I1003 00:07:32.092546 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:32 crc kubenswrapper[4832]: I1003 00:07:32.092556 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:32 crc kubenswrapper[4832]: I1003 00:07:32.092573 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:32 crc kubenswrapper[4832]: I1003 00:07:32.092588 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:32Z","lastTransitionTime":"2025-10-03T00:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:32 crc kubenswrapper[4832]: I1003 00:07:32.196277 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:32 crc kubenswrapper[4832]: I1003 00:07:32.196554 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:32 crc kubenswrapper[4832]: I1003 00:07:32.196734 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:32 crc kubenswrapper[4832]: I1003 00:07:32.196953 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:32 crc kubenswrapper[4832]: I1003 00:07:32.197121 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:32Z","lastTransitionTime":"2025-10-03T00:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:32 crc kubenswrapper[4832]: I1003 00:07:32.299413 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:32 crc kubenswrapper[4832]: I1003 00:07:32.299637 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:32 crc kubenswrapper[4832]: I1003 00:07:32.299699 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:32 crc kubenswrapper[4832]: I1003 00:07:32.299786 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:32 crc kubenswrapper[4832]: I1003 00:07:32.299900 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:32Z","lastTransitionTime":"2025-10-03T00:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:32 crc kubenswrapper[4832]: I1003 00:07:32.401950 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:32 crc kubenswrapper[4832]: I1003 00:07:32.401994 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:32 crc kubenswrapper[4832]: I1003 00:07:32.402002 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:32 crc kubenswrapper[4832]: I1003 00:07:32.402016 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:32 crc kubenswrapper[4832]: I1003 00:07:32.402025 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:32Z","lastTransitionTime":"2025-10-03T00:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:32 crc kubenswrapper[4832]: I1003 00:07:32.504276 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:32 crc kubenswrapper[4832]: I1003 00:07:32.504316 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:32 crc kubenswrapper[4832]: I1003 00:07:32.504326 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:32 crc kubenswrapper[4832]: I1003 00:07:32.504339 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:32 crc kubenswrapper[4832]: I1003 00:07:32.504349 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:32Z","lastTransitionTime":"2025-10-03T00:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:32 crc kubenswrapper[4832]: I1003 00:07:32.606370 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:32 crc kubenswrapper[4832]: I1003 00:07:32.606417 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:32 crc kubenswrapper[4832]: I1003 00:07:32.606429 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:32 crc kubenswrapper[4832]: I1003 00:07:32.606448 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:32 crc kubenswrapper[4832]: I1003 00:07:32.606459 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:32Z","lastTransitionTime":"2025-10-03T00:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:32 crc kubenswrapper[4832]: I1003 00:07:32.708560 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:32 crc kubenswrapper[4832]: I1003 00:07:32.708591 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:32 crc kubenswrapper[4832]: I1003 00:07:32.708600 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:32 crc kubenswrapper[4832]: I1003 00:07:32.708613 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:32 crc kubenswrapper[4832]: I1003 00:07:32.708622 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:32Z","lastTransitionTime":"2025-10-03T00:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:32 crc kubenswrapper[4832]: I1003 00:07:32.811021 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:32 crc kubenswrapper[4832]: I1003 00:07:32.811049 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:32 crc kubenswrapper[4832]: I1003 00:07:32.811057 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:32 crc kubenswrapper[4832]: I1003 00:07:32.811069 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:32 crc kubenswrapper[4832]: I1003 00:07:32.811077 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:32Z","lastTransitionTime":"2025-10-03T00:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:32 crc kubenswrapper[4832]: I1003 00:07:32.912930 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:32 crc kubenswrapper[4832]: I1003 00:07:32.912960 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:32 crc kubenswrapper[4832]: I1003 00:07:32.912968 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:32 crc kubenswrapper[4832]: I1003 00:07:32.912980 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:32 crc kubenswrapper[4832]: I1003 00:07:32.912989 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:32Z","lastTransitionTime":"2025-10-03T00:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:33 crc kubenswrapper[4832]: I1003 00:07:33.015706 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:33 crc kubenswrapper[4832]: I1003 00:07:33.015866 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:33 crc kubenswrapper[4832]: I1003 00:07:33.015886 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:33 crc kubenswrapper[4832]: I1003 00:07:33.015944 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:33 crc kubenswrapper[4832]: I1003 00:07:33.015965 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:33Z","lastTransitionTime":"2025-10-03T00:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:33 crc kubenswrapper[4832]: I1003 00:07:33.041576 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:33 crc kubenswrapper[4832]: I1003 00:07:33.041633 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:33 crc kubenswrapper[4832]: I1003 00:07:33.041644 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:33 crc kubenswrapper[4832]: I1003 00:07:33.041661 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:33 crc kubenswrapper[4832]: I1003 00:07:33.041672 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:33Z","lastTransitionTime":"2025-10-03T00:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:33 crc kubenswrapper[4832]: E1003 00:07:33.057871 4832 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T00:07:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T00:07:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T00:07:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T00:07:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1e5f816a-2677-4a9b-8b73-ae876a2ba5f2\\\",\\\"systemUUID\\\":\\\"22c336a9-c598-416a-a85e-c54be0a6a3c5\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:33Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:33 crc kubenswrapper[4832]: I1003 00:07:33.061120 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:33 crc kubenswrapper[4832]: I1003 00:07:33.061159 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 00:07:33 crc kubenswrapper[4832]: I1003 00:07:33.061169 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:33 crc kubenswrapper[4832]: I1003 00:07:33.061183 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:33 crc kubenswrapper[4832]: I1003 00:07:33.061192 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:33Z","lastTransitionTime":"2025-10-03T00:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:33 crc kubenswrapper[4832]: E1003 00:07:33.078689 4832 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T00:07:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T00:07:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T00:07:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T00:07:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1e5f816a-2677-4a9b-8b73-ae876a2ba5f2\\\",\\\"systemUUID\\\":\\\"22c336a9-c598-416a-a85e-c54be0a6a3c5\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:33Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:33 crc kubenswrapper[4832]: I1003 00:07:33.082119 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:33 crc kubenswrapper[4832]: I1003 00:07:33.082293 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 00:07:33 crc kubenswrapper[4832]: I1003 00:07:33.082368 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:33 crc kubenswrapper[4832]: I1003 00:07:33.082452 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:33 crc kubenswrapper[4832]: I1003 00:07:33.082525 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:33Z","lastTransitionTime":"2025-10-03T00:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:33 crc kubenswrapper[4832]: E1003 00:07:33.097469 4832 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T00:07:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T00:07:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T00:07:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T00:07:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1e5f816a-2677-4a9b-8b73-ae876a2ba5f2\\\",\\\"systemUUID\\\":\\\"22c336a9-c598-416a-a85e-c54be0a6a3c5\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:33Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:33 crc kubenswrapper[4832]: I1003 00:07:33.101329 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:33 crc kubenswrapper[4832]: I1003 00:07:33.101372 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 00:07:33 crc kubenswrapper[4832]: I1003 00:07:33.101387 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:33 crc kubenswrapper[4832]: I1003 00:07:33.101407 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:33 crc kubenswrapper[4832]: I1003 00:07:33.101423 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:33Z","lastTransitionTime":"2025-10-03T00:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:33 crc kubenswrapper[4832]: E1003 00:07:33.118261 4832 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T00:07:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T00:07:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T00:07:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T00:07:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1e5f816a-2677-4a9b-8b73-ae876a2ba5f2\\\",\\\"systemUUID\\\":\\\"22c336a9-c598-416a-a85e-c54be0a6a3c5\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:33Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:33 crc kubenswrapper[4832]: I1003 00:07:33.122786 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:33 crc kubenswrapper[4832]: I1003 00:07:33.122824 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 00:07:33 crc kubenswrapper[4832]: I1003 00:07:33.122835 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:33 crc kubenswrapper[4832]: I1003 00:07:33.122853 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:33 crc kubenswrapper[4832]: I1003 00:07:33.122866 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:33Z","lastTransitionTime":"2025-10-03T00:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:33 crc kubenswrapper[4832]: E1003 00:07:33.136238 4832 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T00:07:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T00:07:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T00:07:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T00:07:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1e5f816a-2677-4a9b-8b73-ae876a2ba5f2\\\",\\\"systemUUID\\\":\\\"22c336a9-c598-416a-a85e-c54be0a6a3c5\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:33Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:33 crc kubenswrapper[4832]: E1003 00:07:33.136394 4832 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 03 00:07:33 crc kubenswrapper[4832]: I1003 00:07:33.137927 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 03 00:07:33 crc kubenswrapper[4832]: I1003 00:07:33.137983 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:33 crc kubenswrapper[4832]: I1003 00:07:33.138003 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:33 crc kubenswrapper[4832]: I1003 00:07:33.138026 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:33 crc kubenswrapper[4832]: I1003 00:07:33.138041 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:33Z","lastTransitionTime":"2025-10-03T00:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:33 crc kubenswrapper[4832]: I1003 00:07:33.244201 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:33 crc kubenswrapper[4832]: I1003 00:07:33.244502 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:33 crc kubenswrapper[4832]: I1003 00:07:33.244629 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:33 crc kubenswrapper[4832]: I1003 00:07:33.244744 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:33 crc kubenswrapper[4832]: I1003 00:07:33.244896 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:33Z","lastTransitionTime":"2025-10-03T00:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:33 crc kubenswrapper[4832]: I1003 00:07:33.347672 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:33 crc kubenswrapper[4832]: I1003 00:07:33.347962 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:33 crc kubenswrapper[4832]: I1003 00:07:33.348026 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:33 crc kubenswrapper[4832]: I1003 00:07:33.348088 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:33 crc kubenswrapper[4832]: I1003 00:07:33.348152 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:33Z","lastTransitionTime":"2025-10-03T00:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:33 crc kubenswrapper[4832]: I1003 00:07:33.386250 4832 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 00:07:33 crc kubenswrapper[4832]: I1003 00:07:33.386313 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 00:07:33 crc kubenswrapper[4832]: E1003 00:07:33.386672 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 00:07:33 crc kubenswrapper[4832]: I1003 00:07:33.386373 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ppthn" Oct 03 00:07:33 crc kubenswrapper[4832]: E1003 00:07:33.386671 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 00:07:33 crc kubenswrapper[4832]: I1003 00:07:33.386370 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 00:07:33 crc kubenswrapper[4832]: E1003 00:07:33.386794 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ppthn" podUID="f3e796ab-f354-47c2-8f1b-1d29fe9ab059" Oct 03 00:07:33 crc kubenswrapper[4832]: E1003 00:07:33.386918 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 00:07:33 crc kubenswrapper[4832]: I1003 00:07:33.416001 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f3e796ab-f354-47c2-8f1b-1d29fe9ab059-metrics-certs\") pod \"network-metrics-daemon-ppthn\" (UID: \"f3e796ab-f354-47c2-8f1b-1d29fe9ab059\") " pod="openshift-multus/network-metrics-daemon-ppthn" Oct 03 00:07:33 crc kubenswrapper[4832]: E1003 00:07:33.416188 4832 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 00:07:33 crc kubenswrapper[4832]: E1003 00:07:33.416342 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f3e796ab-f354-47c2-8f1b-1d29fe9ab059-metrics-certs podName:f3e796ab-f354-47c2-8f1b-1d29fe9ab059 nodeName:}" failed. 
No retries permitted until 2025-10-03 00:07:41.41632494 +0000 UTC m=+54.678480036 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f3e796ab-f354-47c2-8f1b-1d29fe9ab059-metrics-certs") pod "network-metrics-daemon-ppthn" (UID: "f3e796ab-f354-47c2-8f1b-1d29fe9ab059") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 00:07:33 crc kubenswrapper[4832]: I1003 00:07:33.450220 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:33 crc kubenswrapper[4832]: I1003 00:07:33.450433 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:33 crc kubenswrapper[4832]: I1003 00:07:33.450497 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:33 crc kubenswrapper[4832]: I1003 00:07:33.450559 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:33 crc kubenswrapper[4832]: I1003 00:07:33.450620 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:33Z","lastTransitionTime":"2025-10-03T00:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:33 crc kubenswrapper[4832]: I1003 00:07:33.552940 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:33 crc kubenswrapper[4832]: I1003 00:07:33.552988 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:33 crc kubenswrapper[4832]: I1003 00:07:33.552999 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:33 crc kubenswrapper[4832]: I1003 00:07:33.553017 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:33 crc kubenswrapper[4832]: I1003 00:07:33.553029 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:33Z","lastTransitionTime":"2025-10-03T00:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:33 crc kubenswrapper[4832]: I1003 00:07:33.655303 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:33 crc kubenswrapper[4832]: I1003 00:07:33.655396 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:33 crc kubenswrapper[4832]: I1003 00:07:33.655418 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:33 crc kubenswrapper[4832]: I1003 00:07:33.655440 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:33 crc kubenswrapper[4832]: I1003 00:07:33.655456 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:33Z","lastTransitionTime":"2025-10-03T00:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:33 crc kubenswrapper[4832]: I1003 00:07:33.758128 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:33 crc kubenswrapper[4832]: I1003 00:07:33.758180 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:33 crc kubenswrapper[4832]: I1003 00:07:33.758191 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:33 crc kubenswrapper[4832]: I1003 00:07:33.758206 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:33 crc kubenswrapper[4832]: I1003 00:07:33.758215 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:33Z","lastTransitionTime":"2025-10-03T00:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:33 crc kubenswrapper[4832]: I1003 00:07:33.860635 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:33 crc kubenswrapper[4832]: I1003 00:07:33.860689 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:33 crc kubenswrapper[4832]: I1003 00:07:33.860701 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:33 crc kubenswrapper[4832]: I1003 00:07:33.860720 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:33 crc kubenswrapper[4832]: I1003 00:07:33.860731 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:33Z","lastTransitionTime":"2025-10-03T00:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:33 crc kubenswrapper[4832]: I1003 00:07:33.963364 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:33 crc kubenswrapper[4832]: I1003 00:07:33.963404 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:33 crc kubenswrapper[4832]: I1003 00:07:33.963414 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:33 crc kubenswrapper[4832]: I1003 00:07:33.963430 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:33 crc kubenswrapper[4832]: I1003 00:07:33.963441 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:33Z","lastTransitionTime":"2025-10-03T00:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:34 crc kubenswrapper[4832]: I1003 00:07:34.066022 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:34 crc kubenswrapper[4832]: I1003 00:07:34.066096 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:34 crc kubenswrapper[4832]: I1003 00:07:34.066106 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:34 crc kubenswrapper[4832]: I1003 00:07:34.066124 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:34 crc kubenswrapper[4832]: I1003 00:07:34.066135 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:34Z","lastTransitionTime":"2025-10-03T00:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:34 crc kubenswrapper[4832]: I1003 00:07:34.168553 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:34 crc kubenswrapper[4832]: I1003 00:07:34.168612 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:34 crc kubenswrapper[4832]: I1003 00:07:34.168629 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:34 crc kubenswrapper[4832]: I1003 00:07:34.168650 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:34 crc kubenswrapper[4832]: I1003 00:07:34.168667 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:34Z","lastTransitionTime":"2025-10-03T00:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:34 crc kubenswrapper[4832]: I1003 00:07:34.271281 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:34 crc kubenswrapper[4832]: I1003 00:07:34.271396 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:34 crc kubenswrapper[4832]: I1003 00:07:34.271429 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:34 crc kubenswrapper[4832]: I1003 00:07:34.271459 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:34 crc kubenswrapper[4832]: I1003 00:07:34.271482 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:34Z","lastTransitionTime":"2025-10-03T00:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:34 crc kubenswrapper[4832]: I1003 00:07:34.373451 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:34 crc kubenswrapper[4832]: I1003 00:07:34.373489 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:34 crc kubenswrapper[4832]: I1003 00:07:34.373498 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:34 crc kubenswrapper[4832]: I1003 00:07:34.373511 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:34 crc kubenswrapper[4832]: I1003 00:07:34.373521 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:34Z","lastTransitionTime":"2025-10-03T00:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:34 crc kubenswrapper[4832]: I1003 00:07:34.475329 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:34 crc kubenswrapper[4832]: I1003 00:07:34.475373 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:34 crc kubenswrapper[4832]: I1003 00:07:34.475387 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:34 crc kubenswrapper[4832]: I1003 00:07:34.475407 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:34 crc kubenswrapper[4832]: I1003 00:07:34.475420 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:34Z","lastTransitionTime":"2025-10-03T00:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:34 crc kubenswrapper[4832]: I1003 00:07:34.577859 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:34 crc kubenswrapper[4832]: I1003 00:07:34.577900 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:34 crc kubenswrapper[4832]: I1003 00:07:34.577908 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:34 crc kubenswrapper[4832]: I1003 00:07:34.577923 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:34 crc kubenswrapper[4832]: I1003 00:07:34.577932 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:34Z","lastTransitionTime":"2025-10-03T00:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:34 crc kubenswrapper[4832]: I1003 00:07:34.679979 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:34 crc kubenswrapper[4832]: I1003 00:07:34.680008 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:34 crc kubenswrapper[4832]: I1003 00:07:34.680016 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:34 crc kubenswrapper[4832]: I1003 00:07:34.680027 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:34 crc kubenswrapper[4832]: I1003 00:07:34.680036 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:34Z","lastTransitionTime":"2025-10-03T00:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:34 crc kubenswrapper[4832]: I1003 00:07:34.781786 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:34 crc kubenswrapper[4832]: I1003 00:07:34.781836 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:34 crc kubenswrapper[4832]: I1003 00:07:34.781848 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:34 crc kubenswrapper[4832]: I1003 00:07:34.781867 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:34 crc kubenswrapper[4832]: I1003 00:07:34.781881 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:34Z","lastTransitionTime":"2025-10-03T00:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:34 crc kubenswrapper[4832]: I1003 00:07:34.884571 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:34 crc kubenswrapper[4832]: I1003 00:07:34.884608 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:34 crc kubenswrapper[4832]: I1003 00:07:34.884621 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:34 crc kubenswrapper[4832]: I1003 00:07:34.884638 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:34 crc kubenswrapper[4832]: I1003 00:07:34.884648 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:34Z","lastTransitionTime":"2025-10-03T00:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:34 crc kubenswrapper[4832]: I1003 00:07:34.987074 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:34 crc kubenswrapper[4832]: I1003 00:07:34.987128 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:34 crc kubenswrapper[4832]: I1003 00:07:34.987140 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:34 crc kubenswrapper[4832]: I1003 00:07:34.987155 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:34 crc kubenswrapper[4832]: I1003 00:07:34.987167 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:34Z","lastTransitionTime":"2025-10-03T00:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:35 crc kubenswrapper[4832]: I1003 00:07:35.089578 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:35 crc kubenswrapper[4832]: I1003 00:07:35.089640 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:35 crc kubenswrapper[4832]: I1003 00:07:35.089657 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:35 crc kubenswrapper[4832]: I1003 00:07:35.089682 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:35 crc kubenswrapper[4832]: I1003 00:07:35.089703 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:35Z","lastTransitionTime":"2025-10-03T00:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:35 crc kubenswrapper[4832]: I1003 00:07:35.192629 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:35 crc kubenswrapper[4832]: I1003 00:07:35.192721 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:35 crc kubenswrapper[4832]: I1003 00:07:35.192741 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:35 crc kubenswrapper[4832]: I1003 00:07:35.192786 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:35 crc kubenswrapper[4832]: I1003 00:07:35.192804 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:35Z","lastTransitionTime":"2025-10-03T00:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:35 crc kubenswrapper[4832]: I1003 00:07:35.295791 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:35 crc kubenswrapper[4832]: I1003 00:07:35.295832 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:35 crc kubenswrapper[4832]: I1003 00:07:35.295842 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:35 crc kubenswrapper[4832]: I1003 00:07:35.295859 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:35 crc kubenswrapper[4832]: I1003 00:07:35.295868 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:35Z","lastTransitionTime":"2025-10-03T00:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:35 crc kubenswrapper[4832]: I1003 00:07:35.385855 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 00:07:35 crc kubenswrapper[4832]: I1003 00:07:35.385972 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 00:07:35 crc kubenswrapper[4832]: E1003 00:07:35.386075 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 00:07:35 crc kubenswrapper[4832]: I1003 00:07:35.386161 4832 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-ppthn" Oct 03 00:07:35 crc kubenswrapper[4832]: E1003 00:07:35.386322 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ppthn" podUID="f3e796ab-f354-47c2-8f1b-1d29fe9ab059" Oct 03 00:07:35 crc kubenswrapper[4832]: I1003 00:07:35.387879 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 00:07:35 crc kubenswrapper[4832]: I1003 00:07:35.387918 4832 scope.go:117] "RemoveContainer" containerID="4f68e28fe1f55d438c3d42880d9ed4a20367c2e4bd5e8636372586dfa489868e" Oct 03 00:07:35 crc kubenswrapper[4832]: E1003 00:07:35.388051 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 00:07:35 crc kubenswrapper[4832]: E1003 00:07:35.388476 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 00:07:35 crc kubenswrapper[4832]: I1003 00:07:35.402293 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:35 crc kubenswrapper[4832]: I1003 00:07:35.402468 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:35 crc kubenswrapper[4832]: I1003 00:07:35.402490 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:35 crc kubenswrapper[4832]: I1003 00:07:35.402516 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:35 crc kubenswrapper[4832]: I1003 00:07:35.402535 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:35Z","lastTransitionTime":"2025-10-03T00:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:35 crc kubenswrapper[4832]: I1003 00:07:35.512738 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:35 crc kubenswrapper[4832]: I1003 00:07:35.512811 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:35 crc kubenswrapper[4832]: I1003 00:07:35.512824 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:35 crc kubenswrapper[4832]: I1003 00:07:35.512842 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:35 crc kubenswrapper[4832]: I1003 00:07:35.512854 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:35Z","lastTransitionTime":"2025-10-03T00:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:35 crc kubenswrapper[4832]: I1003 00:07:35.616073 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:35 crc kubenswrapper[4832]: I1003 00:07:35.616117 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:35 crc kubenswrapper[4832]: I1003 00:07:35.616132 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:35 crc kubenswrapper[4832]: I1003 00:07:35.616154 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:35 crc kubenswrapper[4832]: I1003 00:07:35.616171 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:35Z","lastTransitionTime":"2025-10-03T00:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:35 crc kubenswrapper[4832]: I1003 00:07:35.719465 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:35 crc kubenswrapper[4832]: I1003 00:07:35.719715 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:35 crc kubenswrapper[4832]: I1003 00:07:35.719732 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:35 crc kubenswrapper[4832]: I1003 00:07:35.719769 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:35 crc kubenswrapper[4832]: I1003 00:07:35.719780 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:35Z","lastTransitionTime":"2025-10-03T00:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:35 crc kubenswrapper[4832]: I1003 00:07:35.729388 4832 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zc9xl_0f189ec1-8169-4886-bff6-f67f7958050e/ovnkube-controller/1.log" Oct 03 00:07:35 crc kubenswrapper[4832]: I1003 00:07:35.732652 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" event={"ID":"0f189ec1-8169-4886-bff6-f67f7958050e","Type":"ContainerStarted","Data":"4d07c7c33d5db217414d17af04896ed7415afa2e32a6d9686a330231dd0d0a0f"} Oct 03 00:07:35 crc kubenswrapper[4832]: I1003 00:07:35.734040 4832 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" Oct 03 00:07:35 crc kubenswrapper[4832]: I1003 00:07:35.749399 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78a6b04d90ea1fd4356f0a54594a83fd6029be216c6263654ad4916f2c3b7f5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:35Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:35 crc kubenswrapper[4832]: I1003 00:07:35.762339 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-b9c9g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4369beec-ad3f-4de9-909b-7d0f8c76aad3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://898c5e76b7f0bbad018b2aa9db1e27240ccbd9d5b66854f548aa6d664abc8799\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvpm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-b9c9g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:35Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:35 crc kubenswrapper[4832]: I1003 00:07:35.790669 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f189ec1-8169-4886-bff6-f67f7958050e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ba40dbb3c6b603f4c75b3f017b0b8953a189aac6dba4cc0e1fa5174b6353f3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2c63d70335d75dec7e19ae160dfa2a00f834d9aa3725bb4b63b94aa488fca0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3b296d00954bf341fee4ad258c6ee44adbd2950c89e73aaf5141201b61c5332\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7712a571bceb2c88adb90f9f18cdef6f7666e96341c1f9c87e6a55c82b59efd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6ee507d1127a1ca82e96f57a031cfef9adbd0b310588ce49a32f0cbcab7b3f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e493b0aab45e7a7842b48f0a13a6850b3bcd8a8f2147880ff278b92b04e9535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d07c7c33d5db217414d17af04896ed7415afa2e32a6d9686a330231dd0d0a0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f68e28fe1f55d438c3d42880d9ed4a20367c2e4bd5e8636372586dfa489868e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T00:07:22Z\\\",\\\"message\\\":\\\"s:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-controller-manager-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-controller-manager-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.58\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF1003 00:07:22.607875 6290 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed 
t\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:21Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae1e89f85d166170fd478f58e653b46eaa1c8335c1b124af9fde5b805864197b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"cont
ainerID\\\":\\\"cri-o://0669e6b46f57beece4c83940bcffe580a4b4e2db5451a9c9ca4a9ad486fc95b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0669e6b46f57beece4c83940bcffe580a4b4e2db5451a9c9ca4a9ad486fc95b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zc9xl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:35Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:35 crc kubenswrapper[4832]: I1003 00:07:35.808816 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:35Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:35 crc kubenswrapper[4832]: I1003 00:07:35.822474 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:35 crc kubenswrapper[4832]: I1003 00:07:35.822541 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:35 crc kubenswrapper[4832]: I1003 00:07:35.822557 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:35 crc kubenswrapper[4832]: I1003 00:07:35.822576 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:35 crc kubenswrapper[4832]: I1003 00:07:35.822627 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:35Z","lastTransitionTime":"2025-10-03T00:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:35 crc kubenswrapper[4832]: I1003 00:07:35.833696 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13eeb165-19f7-4e3f-a442-1c9c570d853b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://196d3468b2a4ea6792c19f85b28df1b5df5e612428d986d2affed70158ba92f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfnk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://786805325c320e5c08542c14a9ac9790f7c2dec237737fe1aca1523d1f2d26a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfnk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xzpk4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:35Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:35 crc kubenswrapper[4832]: I1003 00:07:35.847355 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fad7ffe6-f721-4c38-8acd-2d4de59b527b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da0287ba03b221e49a8840f93c53df8a865b9216a4e93f4a0fc4c62d79227004\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edaa07dca4d8aba0a6439a43af7397b5d9ef4d5a398f5c03ac93f3f41fb80ed9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef77e4c71ea4d3dd7aa61afb0787bf88352730ae638cfefe1f9f9f033208692a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z
\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0636ac12ebbda7e4ee8bbe4f5be5454ca7a879c92c8f444466a34c387668398c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af8925d64d9d95117625cfca58e2f0c90128ce63c82370e3a505d48fce14e474\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"message\\\":\\\"file observer\\\\nW1003 00:07:06.003008 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 00:07:06.003232 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 00:07:06.007835 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-713505854/tls.crt::/tmp/serving-cert-713505854/tls.key\\\\\\\"\\\\nI1003 00:07:06.247728 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 00:07:06.253351 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 00:07:06.253377 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 00:07:06.253665 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 00:07:06.253677 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 00:07:06.261087 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 00:07:06.261596 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 00:07:06.261602 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 00:07:06.261606 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 00:07:06.261610 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 00:07:06.261613 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 00:07:06.261616 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 00:07:06.261905 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 00:07:06.264368 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc6fca5c4906594dc721c77dba62a306d840e88ed949a9a579a695bdc1f1c011\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://359b3f91c772c18a52ed1e5985275405b990187a860cf7a7f62c2f602f54c77e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://359b3f91c772c18a52ed1e5985275405b990187a860cf7a7f62c2f602f54c77e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:06:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:35Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:35 crc kubenswrapper[4832]: I1003 00:07:35.858371 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a530701-412d-4b02-b532-de7993881992\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2b0553c8f774a096bc1446325436a1a73a24f67b3512462ce0b5029092f30c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0451961c80d1affd1cc6903d8c4ccba7de47c25255826d49ee312958f7a73b79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09e57b2ff533ebe739ef6e23dcd8d6f5ef85143046db77ed3a95ffb87f771152\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://703d4dc13279deda56e8154ab7238b507e78852fd477db2bbe6e3e19aa0fbd37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:06:47Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:35Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:35 crc kubenswrapper[4832]: I1003 00:07:35.871811 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rwqt7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96bf1ee1-9249-4db9-83da-17eb15aa5f9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://778c9afa5747e424ff72688b9d90dd21e3faec1f409c2b9683b5a16a37685369\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run
/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm92t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rwqt7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:35Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:35 crc kubenswrapper[4832]: I1003 00:07:35.880674 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-76ck7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b962b7f-e442-4e25-97f4-21f16b9f1324\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e5df7a035e3c381c541f08eec6fffe846e7002a534b4c6ae1316a15cf74e469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ksm2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-76ck7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:35Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:35 crc kubenswrapper[4832]: I1003 00:07:35.897633 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4j5bh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7262a85e-7e44-4422-b1d2-5d045503e06a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afd2b89ee54380c73b04bde347dce4d74a847e4ba94fc7aeabd5ac24e4dfd4b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vwhr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f03e95e621cb9e7d0c04b426440cf746eeccf68892215e1ed9e963935b9fdba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vwhr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4j5bh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:35Z is after 2025-08-24T17:21:41Z" Oct 03 
00:07:35 crc kubenswrapper[4832]: I1003 00:07:35.920495 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ppthn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3e796ab-f354-47c2-8f1b-1d29fe9ab059\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:25Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ppthn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:35Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:35 crc kubenswrapper[4832]: I1003 00:07:35.925137 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:35 crc kubenswrapper[4832]: I1003 00:07:35.925180 4832 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:35 crc kubenswrapper[4832]: I1003 00:07:35.925189 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:35 crc kubenswrapper[4832]: I1003 00:07:35.925202 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:35 crc kubenswrapper[4832]: I1003 00:07:35.925210 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:35Z","lastTransitionTime":"2025-10-03T00:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:35 crc kubenswrapper[4832]: I1003 00:07:35.941214 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31c3e0e9-c2a7-4c60-b523-9972df4adad5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1688f7425bdf5630147752e6d2c0dc58351b86d087c39bc6c4d73ffbf11949b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70f4161badca9a8087ab95b77ede5bc72d0c44b3f7f0223f2073354e76d6070d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernet
es/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5eb1fa15d84200378239e8a064cb83aa401750e92073f9d29d94858e65add776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://342b151ec083dc566428af857e4013b70d34fbf70632147d13f5e251ef3460c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f39a6c9881b2f1d620a85a3c6887d82b81d18ecf599c66b967d5580689537314\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a180babeb6469b2158474f74047864c74117a5c272be9437712bc1729563108b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a180babeb6469b2158474f74047864c74117a5c272be9437712bc1729563108b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:48Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b12fcbe54af7073178320f8a130e78196b2d1d3a055c4d8410e4cc1da2d4686e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b12fcbe54af7073178320f8a130e78196b2d1d3a055c4d8410e4cc1da2d4686e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f5e7853d0c8e96873c687c75d7226d8c11ed0ab5f4f2e1e975423de06b7c8c10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5e7853d0c8e96873c687c75d7226d8c11ed0ab5f4f2e1e975423de06b7c8c10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:06:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:35Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:35 crc kubenswrapper[4832]: I1003 00:07:35.954271 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:35Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:35 crc kubenswrapper[4832]: I1003 00:07:35.967787 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4175afbbdf7905f2c69f608e1cd85fdfae2818582d88e9a6d46c20ba0cede788\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:35Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:35 crc kubenswrapper[4832]: I1003 00:07:35.983328 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6753f7eb0f3ff7e4c94125513be77026242079d658b7139e86701a2b04ba5d1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://823db4c8596fa85759a3d818b0282b8b8b2c4c2e948160accbd306ef09e38060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:35Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:35 crc kubenswrapper[4832]: I1003 
00:07:35.999279 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:35Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:36 crc kubenswrapper[4832]: I1003 00:07:36.014509 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zdfsz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"66a6f849-6870-4e28-8ca4-c849c22ad252\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53e3e718893a18a23dee11b28bb7285f26501527c6aa136acf2b8f7e0d7494c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f49f638397d6dffb41b5774d362b7441d3191f290149de6ecef5c38a6e8a7a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f49f638397d6dffb41b5774d362b7441d3191f290149de6ecef5c38a6e8a7a81\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d52a70ad1026d8eaaeed9dbb581eb90dbc962f036711e2f68085ac437c46b1fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d52a70ad1026d8eaaeed9dbb581eb90dbc962f036711e2f68085ac437c46b1fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ddd5a446bcc39d02bbaf8ae1b9a92b550b975c4280cea13a517076261e177bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ddd5a446bcc39d02bbaf8ae1b9a92b550b975c4280cea13a517076261e177bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f907c9873e2f8cbedebcc3116624dbaf2e6b8088d9f0ef52a5878e1515e73720\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f907c9873e2f8cbedebcc3116624dbaf2e6b8088d9f0ef52a5878e1515e73720\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d991977ea5d7140ddc09fecfc872067a05a7773c0066c94ec88c603b6f79f32e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d991977ea5d7140ddc09fecfc872067a05a7773c0066c94ec88c603b6f79f32e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7711e37ec666f0285f64eac11397ea7747d74997deaba1e9cc141218d28fbaa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7711e37ec666f0285f64eac11397ea7747d74997deaba1e9cc141218d28fbaa7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zdfsz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:36Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:36 crc kubenswrapper[4832]: I1003 00:07:36.028598 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:36 crc kubenswrapper[4832]: I1003 00:07:36.028648 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:36 crc 
kubenswrapper[4832]: I1003 00:07:36.028663 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:36 crc kubenswrapper[4832]: I1003 00:07:36.028683 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:36 crc kubenswrapper[4832]: I1003 00:07:36.028700 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:36Z","lastTransitionTime":"2025-10-03T00:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:36 crc kubenswrapper[4832]: I1003 00:07:36.130861 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:36 crc kubenswrapper[4832]: I1003 00:07:36.130898 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:36 crc kubenswrapper[4832]: I1003 00:07:36.130908 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:36 crc kubenswrapper[4832]: I1003 00:07:36.130924 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:36 crc kubenswrapper[4832]: I1003 00:07:36.130936 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:36Z","lastTransitionTime":"2025-10-03T00:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:36 crc kubenswrapper[4832]: I1003 00:07:36.232890 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:36 crc kubenswrapper[4832]: I1003 00:07:36.232921 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:36 crc kubenswrapper[4832]: I1003 00:07:36.232929 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:36 crc kubenswrapper[4832]: I1003 00:07:36.232943 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:36 crc kubenswrapper[4832]: I1003 00:07:36.232951 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:36Z","lastTransitionTime":"2025-10-03T00:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:36 crc kubenswrapper[4832]: I1003 00:07:36.334850 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:36 crc kubenswrapper[4832]: I1003 00:07:36.334889 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:36 crc kubenswrapper[4832]: I1003 00:07:36.334899 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:36 crc kubenswrapper[4832]: I1003 00:07:36.334914 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:36 crc kubenswrapper[4832]: I1003 00:07:36.334924 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:36Z","lastTransitionTime":"2025-10-03T00:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:36 crc kubenswrapper[4832]: I1003 00:07:36.437351 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:36 crc kubenswrapper[4832]: I1003 00:07:36.437389 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:36 crc kubenswrapper[4832]: I1003 00:07:36.437399 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:36 crc kubenswrapper[4832]: I1003 00:07:36.437414 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:36 crc kubenswrapper[4832]: I1003 00:07:36.437424 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:36Z","lastTransitionTime":"2025-10-03T00:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:36 crc kubenswrapper[4832]: I1003 00:07:36.540266 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:36 crc kubenswrapper[4832]: I1003 00:07:36.540305 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:36 crc kubenswrapper[4832]: I1003 00:07:36.540317 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:36 crc kubenswrapper[4832]: I1003 00:07:36.540333 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:36 crc kubenswrapper[4832]: I1003 00:07:36.540345 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:36Z","lastTransitionTime":"2025-10-03T00:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:36 crc kubenswrapper[4832]: I1003 00:07:36.642538 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:36 crc kubenswrapper[4832]: I1003 00:07:36.642574 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:36 crc kubenswrapper[4832]: I1003 00:07:36.642584 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:36 crc kubenswrapper[4832]: I1003 00:07:36.642598 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:36 crc kubenswrapper[4832]: I1003 00:07:36.642609 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:36Z","lastTransitionTime":"2025-10-03T00:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:36 crc kubenswrapper[4832]: I1003 00:07:36.738463 4832 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zc9xl_0f189ec1-8169-4886-bff6-f67f7958050e/ovnkube-controller/2.log" Oct 03 00:07:36 crc kubenswrapper[4832]: I1003 00:07:36.739035 4832 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zc9xl_0f189ec1-8169-4886-bff6-f67f7958050e/ovnkube-controller/1.log" Oct 03 00:07:36 crc kubenswrapper[4832]: I1003 00:07:36.742080 4832 generic.go:334] "Generic (PLEG): container finished" podID="0f189ec1-8169-4886-bff6-f67f7958050e" containerID="4d07c7c33d5db217414d17af04896ed7415afa2e32a6d9686a330231dd0d0a0f" exitCode=1 Oct 03 00:07:36 crc kubenswrapper[4832]: I1003 00:07:36.742111 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" event={"ID":"0f189ec1-8169-4886-bff6-f67f7958050e","Type":"ContainerDied","Data":"4d07c7c33d5db217414d17af04896ed7415afa2e32a6d9686a330231dd0d0a0f"} Oct 03 00:07:36 crc kubenswrapper[4832]: I1003 00:07:36.742143 4832 scope.go:117] "RemoveContainer" containerID="4f68e28fe1f55d438c3d42880d9ed4a20367c2e4bd5e8636372586dfa489868e" Oct 03 00:07:36 crc kubenswrapper[4832]: I1003 00:07:36.743396 4832 scope.go:117] "RemoveContainer" containerID="4d07c7c33d5db217414d17af04896ed7415afa2e32a6d9686a330231dd0d0a0f" Oct 03 00:07:36 crc kubenswrapper[4832]: E1003 00:07:36.743729 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-zc9xl_openshift-ovn-kubernetes(0f189ec1-8169-4886-bff6-f67f7958050e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" podUID="0f189ec1-8169-4886-bff6-f67f7958050e" Oct 03 00:07:36 crc kubenswrapper[4832]: I1003 00:07:36.744316 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:36 crc kubenswrapper[4832]: I1003 00:07:36.744339 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:36 crc kubenswrapper[4832]: I1003 00:07:36.744350 4832 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Oct 03 00:07:36 crc kubenswrapper[4832]: I1003 00:07:36.744362 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:36 crc kubenswrapper[4832]: I1003 00:07:36.744373 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:36Z","lastTransitionTime":"2025-10-03T00:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:36 crc kubenswrapper[4832]: I1003 00:07:36.759720 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78a6b04d90ea1fd4356f0a54594a83fd6029be216c6263654ad4916f2c3b7f5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:36Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:36 crc kubenswrapper[4832]: I1003 00:07:36.769276 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-b9c9g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4369beec-ad3f-4de9-909b-7d0f8c76aad3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://898c5e76b7f0bbad018b2aa9db1e27240ccbd9d5b66854f548aa6d664abc8799\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvpm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-b9c9g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:36Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:36 crc kubenswrapper[4832]: I1003 00:07:36.788245 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f189ec1-8169-4886-bff6-f67f7958050e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ba40dbb3c6b603f4c75b3f017b0b8953a189aac6dba4cc0e1fa5174b6353f3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2c63d70335d75dec7e19ae160dfa2a00f834d9aa3725bb4b63b94aa488fca0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3b296d00954bf341fee4ad258c6ee44adbd2950c89e73aaf5141201b61c5332\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7712a571bceb2c88adb90f9f18cdef6f7666e96341c1f9c87e6a55c82b59efd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6ee507d1127a1ca82e96f57a031cfef9adbd0b310588ce49a32f0cbcab7b3f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e493b0aab45e7a7842b48f0a13a6850b3bcd8a8f2147880ff278b92b04e9535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d07c7c33d5db217414d17af04896ed7415afa2e32a6d9686a330231dd0d0a0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f68e28fe1f55d438c3d42880d9ed4a20367c2e4bd5e8636372586dfa489868e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T00:07:22Z\\\",\\\"message\\\":\\\"s:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-controller-manager-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-controller-manager-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.58\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF1003 00:07:22.607875 6290 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed t\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:21Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d07c7c33d5db217414d17af04896ed7415afa2e32a6d9686a330231dd0d0a0f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T00:07:36Z\\\",\\\"message\\\":\\\"-crc openshift-multus/multus-additional-cni-plugins-zdfsz openshift-image-registry/node-ca-76ck7 openshift-machine-config-operator/machine-config-daemon-xzpk4 openshift-network-diagnostics/network-check-source-55646444c4-trplf openshift-network-operator/network-operator-58b4c7f79c-55gtf openshift-network-operator/iptables-alerter-4ln5h]\\\\nI1003 00:07:36.207478 6499 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI1003 00:07:36.207478 6499 lb_config.go:1031] Cluster endpoints for openshift-network-console/networking-console-plugin for network=default are: map[]\\\\nF1003 00:07:36.207491 6499 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy 
controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certific\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae1e89f85d166170fd478f58e653b46eaa1c8335c1b124af9fde5b805864197b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\
\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0669e6b46f57beece4c83940bcffe580a4b4e2db5451a9c9ca4a9ad486fc95b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0669e6b46f57beece4c83940bcffe580a4b4e2db5451a9c9ca4a9ad486fc95b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zc9xl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:36Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:36 crc kubenswrapper[4832]: I1003 00:07:36.799321 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:36Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:36 crc kubenswrapper[4832]: I1003 00:07:36.812616 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13eeb165-19f7-4e3f-a442-1c9c570d853b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://196d3468b2a4ea6792c19f85b28df1b5df5e612428d986d2affed70158ba92f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfnk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://786805325c320e5c08542c14a9ac9790f7c2dec237737fe1aca1523d1f2d26a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfnk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xzpk4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:36Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:36 crc kubenswrapper[4832]: I1003 00:07:36.826063 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fad7ffe6-f721-4c38-8acd-2d4de59b527b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da0287ba03b221e49a8840f93c53df8a865b9216a4e93f4a0fc4c62d79227004\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edaa07dca4d8aba0a6439a43af7397b5d9ef4d5a398f5c03ac93f3f41fb80ed9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6
355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef77e4c71ea4d3dd7aa61afb0787bf88352730ae638cfefe1f9f9f033208692a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0636ac12ebbda7e4ee8bbe4f5be5454ca7a879c92c8f444466a34c387668398c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af8925d64d9d95117625cfca58e2f0c90128ce63c82370e3a505d48fce14e474\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"message\\\":\\\"file observer\\\\nW1003 00:07:06.003008 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 00:07:06.003232 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 00:07:06.007835 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-713505854/tls.crt::/tmp/serving-cert-713505854/tls.key\\\\\\\"\\\\nI1003 00:07:06.247728 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 00:07:06.253351 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 00:07:06.253377 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 00:07:06.253665 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 00:07:06.253677 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 00:07:06.261087 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 00:07:06.261596 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 00:07:06.261602 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 00:07:06.261606 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 00:07:06.261610 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 00:07:06.261613 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 00:07:06.261616 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 00:07:06.261905 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 00:07:06.264368 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc6fca5c4906594dc721c77dba62a306d840e88ed949a9a579a695bdc1f1c011\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://359b3f91c772c18a52ed1e5985275405b990187a860cf7a7f62c2f602f54c77e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://359b3f91c772c18a52ed1e5985275405b990187a860cf7a7f62c2f602f54c77e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:06:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:36Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:36 crc kubenswrapper[4832]: I1003 00:07:36.837998 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a530701-412d-4b02-b532-de7993881992\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2b0553c8f774a096bc1446325436a1a73a24f67b3512462ce0b5029092f30c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0451961c80d1affd1cc6903d8c4ccba7de47c25255826d49ee312958f7a73b79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09e57b2ff533ebe739ef6e23dcd8d6f5ef85143046db77ed3a95ffb87f771152\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://703d4dc13279deda56e8154ab7238b507e78852fd477db2bbe6e3e19aa0fbd37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:06:47Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:36Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:36 crc kubenswrapper[4832]: I1003 00:07:36.846128 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:36 crc kubenswrapper[4832]: I1003 00:07:36.846280 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:36 crc kubenswrapper[4832]: I1003 00:07:36.846358 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:36 crc kubenswrapper[4832]: I1003 00:07:36.846424 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:36 crc kubenswrapper[4832]: I1003 00:07:36.846493 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:36Z","lastTransitionTime":"2025-10-03T00:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:36 crc kubenswrapper[4832]: I1003 00:07:36.849526 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6753f7eb0f3ff7e4c94125513be77026242079d658b7139e86701a2b04ba5d1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://823db4c8596fa85759a3d818b0282b8b8b2c4c2e948160accbd306ef09e38060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:36Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:36 crc kubenswrapper[4832]: I1003 00:07:36.860518 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rwqt7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96bf1ee1-9249-4db9-83da-17eb15aa5f9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://778c9afa5747e424ff72688b9d90dd21e3faec1f409c2b9683b5a16a37685369\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm92t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rwqt7\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:36Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:36 crc kubenswrapper[4832]: I1003 00:07:36.868855 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-76ck7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b962b7f-e442-4e25-97f4-21f16b9f1324\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e5df7a035e3c381c541f08eec6fffe846e7002a534b4c6ae1316a15cf74e469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ksm2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-76ck7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:36Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:36 crc kubenswrapper[4832]: I1003 00:07:36.879169 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4j5bh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7262a85e-7e44-4422-b1d2-5d045503e06a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afd2b89ee54380c73b04bde347dce4d74a847e4ba94fc7aeabd5ac24e4dfd4b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vwhr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f03e95e621cb9e7d0c04b426440cf746eeccf68892215e1ed9e963935b9fdba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vwhr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4j5bh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:36Z is after 2025-08-24T17:21:41Z" Oct 03 
00:07:36 crc kubenswrapper[4832]: I1003 00:07:36.887696 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ppthn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3e796ab-f354-47c2-8f1b-1d29fe9ab059\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:25Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ppthn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:36Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:36 crc kubenswrapper[4832]: I1003 00:07:36.909268 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"31c3e0e9-c2a7-4c60-b523-9972df4adad5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1688f7425bdf5630147752e6d2c0dc58351b86d087c39bc6c4d73ffbf11949b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70f4161badca9a8087ab95b77ede5bc72d0c44b3f7f0223f2073354e76d6070d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5eb1fa15d84200378239e8a064cb83aa401750e92073f9d29d94858e65add776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://342b151ec083dc566428af857e4013b70d34fbf
70632147d13f5e251ef3460c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f39a6c9881b2f1d620a85a3c6887d82b81d18ecf599c66b967d5580689537314\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a180babeb6469b2158474f74047864c74117a5c272be9437712bc1729563108b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a180babeb6469b2158474f74047864c74117a5c272be9437712bc1729563108b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b12fcbe54af7073178320f8a130e78196b2d1d3a055c4d8410e4cc1da2d4686e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b12fcbe54af7073178320f8a130e78196b2d1d3a055c4d8410e4cc1da2d4686e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f5e7853d0c8e96873c687c75d7226d8c11ed0ab5f4f2e1e975423de06b7c8c10\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5e7853d0c8e96873c687c75d7226d8c11ed0ab5f4f2e1e975423de06b7c8c10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:06:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:36Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:36 crc kubenswrapper[4832]: I1003 00:07:36.921347 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:36Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:36 crc kubenswrapper[4832]: I1003 00:07:36.935915 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4175afbbdf7905f2c69f608e1cd85fdfae2818582d88e9a6d46c20ba0cede788\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:36Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:36 crc kubenswrapper[4832]: I1003 00:07:36.949158 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:36 crc kubenswrapper[4832]: I1003 00:07:36.949228 4832 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:36 crc kubenswrapper[4832]: I1003 00:07:36.949239 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:36 crc kubenswrapper[4832]: I1003 00:07:36.949258 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:36 crc kubenswrapper[4832]: I1003 00:07:36.949268 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:36Z","lastTransitionTime":"2025-10-03T00:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:36 crc kubenswrapper[4832]: I1003 00:07:36.951713 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:36Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:36 crc kubenswrapper[4832]: I1003 00:07:36.969700 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zdfsz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66a6f849-6870-4e28-8ca4-c849c22ad252\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53e3e718893a18a23dee11b28bb7285f26501527c6aa136acf2b8f7e0d7494c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f49f638397d6dffb41b5774d362b7441d3191f290149de6ecef5c38a6e8a7a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"s
tarted\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f49f638397d6dffb41b5774d362b7441d3191f290149de6ecef5c38a6e8a7a81\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d52a70ad1026d8eaaeed9dbb581eb90dbc962f036711e2f68085ac437c46b1fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d52a70ad1026d8eaaeed9dbb581eb90dbc962f036711e2f68085ac437c46b1fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ddd5a446bcc39d02bbaf8ae1b9a92b550b975c4280cea13a517076261e177bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ddd5a446bcc39d02bbaf8ae1b9a92b550b975c4280cea13a517076261e177bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"
}]},{\\\"containerID\\\":\\\"cri-o://f907c9873e2f8cbedebcc3116624dbaf2e6b8088d9f0ef52a5878e1515e73720\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f907c9873e2f8cbedebcc3116624dbaf2e6b8088d9f0ef52a5878e1515e73720\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d991977ea5d7140ddc09fecfc872067a05a7773c0066c94ec88c603b6f79f32e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d991977ea5d7140ddc09fecfc872067a05a7773c0066c94ec88c603b6f79f32e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7711e37ec666f0285f64eac11397ea7747d74997deaba1e9cc141218d28fbaa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7711e37ec666f0285f64eac11397ea7747d74997deaba1e9cc141218d28fbaa7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":
\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zdfsz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:36Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:37 crc kubenswrapper[4832]: I1003 00:07:37.052089 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:37 crc kubenswrapper[4832]: I1003 00:07:37.052126 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:37 crc kubenswrapper[4832]: I1003 00:07:37.052137 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:37 crc kubenswrapper[4832]: I1003 00:07:37.052152 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:37 crc kubenswrapper[4832]: I1003 00:07:37.052163 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:37Z","lastTransitionTime":"2025-10-03T00:07:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:37 crc kubenswrapper[4832]: I1003 00:07:37.154342 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:37 crc kubenswrapper[4832]: I1003 00:07:37.154380 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:37 crc kubenswrapper[4832]: I1003 00:07:37.154390 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:37 crc kubenswrapper[4832]: I1003 00:07:37.154405 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:37 crc kubenswrapper[4832]: I1003 00:07:37.154416 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:37Z","lastTransitionTime":"2025-10-03T00:07:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:37 crc kubenswrapper[4832]: I1003 00:07:37.256724 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:37 crc kubenswrapper[4832]: I1003 00:07:37.257135 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:37 crc kubenswrapper[4832]: I1003 00:07:37.257200 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:37 crc kubenswrapper[4832]: I1003 00:07:37.257263 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:37 crc kubenswrapper[4832]: I1003 00:07:37.257333 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:37Z","lastTransitionTime":"2025-10-03T00:07:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:37 crc kubenswrapper[4832]: I1003 00:07:37.360103 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:37 crc kubenswrapper[4832]: I1003 00:07:37.360158 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:37 crc kubenswrapper[4832]: I1003 00:07:37.360170 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:37 crc kubenswrapper[4832]: I1003 00:07:37.360186 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:37 crc kubenswrapper[4832]: I1003 00:07:37.360197 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:37Z","lastTransitionTime":"2025-10-03T00:07:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:37 crc kubenswrapper[4832]: I1003 00:07:37.385945 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ppthn" Oct 03 00:07:37 crc kubenswrapper[4832]: I1003 00:07:37.386031 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 00:07:37 crc kubenswrapper[4832]: E1003 00:07:37.386093 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ppthn" podUID="f3e796ab-f354-47c2-8f1b-1d29fe9ab059" Oct 03 00:07:37 crc kubenswrapper[4832]: E1003 00:07:37.386142 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 00:07:37 crc kubenswrapper[4832]: I1003 00:07:37.386593 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 00:07:37 crc kubenswrapper[4832]: E1003 00:07:37.386938 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 00:07:37 crc kubenswrapper[4832]: I1003 00:07:37.387272 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 00:07:37 crc kubenswrapper[4832]: E1003 00:07:37.387600 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 00:07:37 crc kubenswrapper[4832]: I1003 00:07:37.401142 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:37Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:37 crc kubenswrapper[4832]: I1003 00:07:37.413887 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13eeb165-19f7-4e3f-a442-1c9c570d853b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://196d3468b2a4ea6792c19f85b28df1b5df5e612428d986d2affed70158ba92f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfnk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://786805325c320e5c08542c14a9ac9790f7c2dec237737fe1aca1523d1f2d26a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfnk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xzpk4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:37Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:37 crc kubenswrapper[4832]: I1003 00:07:37.429890 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fad7ffe6-f721-4c38-8acd-2d4de59b527b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da0287ba03b221e49a8840f93c53df8a865b9216a4e93f4a0fc4c62d79227004\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edaa07dca4d8aba0a6439a43af7397b5d9ef4d5a398f5c03ac93f3f41fb80ed9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6
355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef77e4c71ea4d3dd7aa61afb0787bf88352730ae638cfefe1f9f9f033208692a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0636ac12ebbda7e4ee8bbe4f5be5454ca7a879c92c8f444466a34c387668398c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af8925d64d9d95117625cfca58e2f0c90128ce63c82370e3a505d48fce14e474\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"message\\\":\\\"file observer\\\\nW1003 00:07:06.003008 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 00:07:06.003232 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 00:07:06.007835 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-713505854/tls.crt::/tmp/serving-cert-713505854/tls.key\\\\\\\"\\\\nI1003 00:07:06.247728 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 00:07:06.253351 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 00:07:06.253377 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 00:07:06.253665 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 00:07:06.253677 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 00:07:06.261087 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 00:07:06.261596 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 00:07:06.261602 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 00:07:06.261606 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 00:07:06.261610 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 00:07:06.261613 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 00:07:06.261616 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 00:07:06.261905 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 00:07:06.264368 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc6fca5c4906594dc721c77dba62a306d840e88ed949a9a579a695bdc1f1c011\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://359b3f91c772c18a52ed1e5985275405b990187a860cf7a7f62c2f602f54c77e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://359b3f91c772c18a52ed1e5985275405b990187a860cf7a7f62c2f602f54c77e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:06:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:37Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:37 crc kubenswrapper[4832]: I1003 00:07:37.443038 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a530701-412d-4b02-b532-de7993881992\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2b0553c8f774a096bc1446325436a1a73a24f67b3512462ce0b5029092f30c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0451961c80d1affd1cc6903d8c4ccba7de47c25255826d49ee312958f7a73b79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09e57b2ff533ebe739ef6e23dcd8d6f5ef85143046db77ed3a95ffb87f771152\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://703d4dc13279deda56e8154ab7238b507e78852fd477db2bbe6e3e19aa0fbd37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:06:47Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:37Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:37 crc kubenswrapper[4832]: I1003 00:07:37.457416 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rwqt7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96bf1ee1-9249-4db9-83da-17eb15aa5f9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://778c9afa5747e424ff72688b9d90dd21e3faec1f409c2b9683b5a16a37685369\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run
/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm92t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rwqt7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:37Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:37 crc kubenswrapper[4832]: I1003 00:07:37.462267 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:37 crc kubenswrapper[4832]: I1003 00:07:37.462295 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:37 crc kubenswrapper[4832]: I1003 00:07:37.462306 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:37 crc kubenswrapper[4832]: I1003 00:07:37.462321 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:37 crc kubenswrapper[4832]: I1003 00:07:37.462332 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:37Z","lastTransitionTime":"2025-10-03T00:07:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:37 crc kubenswrapper[4832]: I1003 00:07:37.474502 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-76ck7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b962b7f-e442-4e25-97f4-21f16b9f1324\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e5df7a035e3c381c541f08eec6fffe846e7002a534b4c6ae1316a15cf74e469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ksm2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-76ck7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:37Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:37 crc kubenswrapper[4832]: I1003 00:07:37.489344 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4j5bh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7262a85e-7e44-4422-b1d2-5d045503e06a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afd2b89ee54380c73b04bde347dce4d74a847e4ba94fc7aeabd5ac24e4dfd4b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vwhr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f03e95e621cb9e7d0c04b426440cf746eeccf68892215e1ed9e963935b9fdba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vwhr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4j5bh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:37Z is after 2025-08-24T17:21:41Z" Oct 03 
00:07:37 crc kubenswrapper[4832]: I1003 00:07:37.502232 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ppthn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3e796ab-f354-47c2-8f1b-1d29fe9ab059\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:25Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ppthn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:37Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:37 crc kubenswrapper[4832]: I1003 00:07:37.522704 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"31c3e0e9-c2a7-4c60-b523-9972df4adad5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1688f7425bdf5630147752e6d2c0dc58351b86d087c39bc6c4d73ffbf11949b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70f4161badca9a8087ab95b77ede5bc72d0c44b3f7f0223f2073354e76d6070d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5eb1fa15d84200378239e8a064cb83aa401750e92073f9d29d94858e65add776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://342b151ec083dc566428af857e4013b70d34fbf
70632147d13f5e251ef3460c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f39a6c9881b2f1d620a85a3c6887d82b81d18ecf599c66b967d5580689537314\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a180babeb6469b2158474f74047864c74117a5c272be9437712bc1729563108b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a180babeb6469b2158474f74047864c74117a5c272be9437712bc1729563108b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b12fcbe54af7073178320f8a130e78196b2d1d3a055c4d8410e4cc1da2d4686e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b12fcbe54af7073178320f8a130e78196b2d1d3a055c4d8410e4cc1da2d4686e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f5e7853d0c8e96873c687c75d7226d8c11ed0ab5f4f2e1e975423de06b7c8c10\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5e7853d0c8e96873c687c75d7226d8c11ed0ab5f4f2e1e975423de06b7c8c10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:06:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:37Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:37 crc kubenswrapper[4832]: I1003 00:07:37.534211 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:37Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:37 crc kubenswrapper[4832]: I1003 00:07:37.548960 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4175afbbdf7905f2c69f608e1cd85fdfae2818582d88e9a6d46c20ba0cede788\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:37Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:37 crc kubenswrapper[4832]: I1003 00:07:37.561652 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6753f7eb0f3ff7e4c94125513be77026242079d658b7139e86701a2b04ba5d1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://823db4c8596fa85759a3d818b0282b8b8b2c4c2e948160accbd306ef09e38060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:37Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:37 crc kubenswrapper[4832]: I1003 00:07:37.564637 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:37 crc kubenswrapper[4832]: I1003 00:07:37.564683 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:37 crc kubenswrapper[4832]: I1003 00:07:37.564693 4832 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Oct 03 00:07:37 crc kubenswrapper[4832]: I1003 00:07:37.564707 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:37 crc kubenswrapper[4832]: I1003 00:07:37.564716 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:37Z","lastTransitionTime":"2025-10-03T00:07:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:37 crc kubenswrapper[4832]: I1003 00:07:37.573497 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:37Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:37 crc kubenswrapper[4832]: I1003 00:07:37.589584 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zdfsz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66a6f849-6870-4e28-8ca4-c849c22ad252\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53e3e718893a18a23dee11b28bb7285f26501527c6aa136acf2b8f7e0d7494c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f49f638397d6dffb41b5774d362b7441d3191f290149de6ecef5c38a6e8a7a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"s
tarted\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f49f638397d6dffb41b5774d362b7441d3191f290149de6ecef5c38a6e8a7a81\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d52a70ad1026d8eaaeed9dbb581eb90dbc962f036711e2f68085ac437c46b1fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d52a70ad1026d8eaaeed9dbb581eb90dbc962f036711e2f68085ac437c46b1fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ddd5a446bcc39d02bbaf8ae1b9a92b550b975c4280cea13a517076261e177bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ddd5a446bcc39d02bbaf8ae1b9a92b550b975c4280cea13a517076261e177bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"
}]},{\\\"containerID\\\":\\\"cri-o://f907c9873e2f8cbedebcc3116624dbaf2e6b8088d9f0ef52a5878e1515e73720\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f907c9873e2f8cbedebcc3116624dbaf2e6b8088d9f0ef52a5878e1515e73720\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d991977ea5d7140ddc09fecfc872067a05a7773c0066c94ec88c603b6f79f32e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d991977ea5d7140ddc09fecfc872067a05a7773c0066c94ec88c603b6f79f32e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7711e37ec666f0285f64eac11397ea7747d74997deaba1e9cc141218d28fbaa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7711e37ec666f0285f64eac11397ea7747d74997deaba1e9cc141218d28fbaa7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":
\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zdfsz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:37Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:37 crc kubenswrapper[4832]: I1003 00:07:37.602500 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78a6b04d90ea1fd4356f0a54594a83fd6029be216c6263654ad4916f2c3b7f5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:37Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:37 crc kubenswrapper[4832]: I1003 00:07:37.614144 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-b9c9g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4369beec-ad3f-4de9-909b-7d0f8c76aad3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://898c5e76b7f0bbad018b2aa9db1e27240ccbd9d5b66854f548aa6d664abc8799\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvpm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-b9c9g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:37Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:37 crc kubenswrapper[4832]: I1003 00:07:37.631630 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f189ec1-8169-4886-bff6-f67f7958050e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ba40dbb3c6b603f4c75b3f017b0b8953a189aac6dba4cc0e1fa5174b6353f3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2c63d70335d75dec7e19ae160dfa2a00f834d9aa3725bb4b63b94aa488fca0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3b296d00954bf341fee4ad258c6ee44adbd2950c89e73aaf5141201b61c5332\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7712a571bceb2c88adb90f9f18cdef6f7666e96341c1f9c87e6a55c82b59efd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6ee507d1127a1ca82e96f57a031cfef9adbd0b310588ce49a32f0cbcab7b3f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e493b0aab45e7a7842b48f0a13a6850b3bcd8a8f2147880ff278b92b04e9535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d07c7c33d5db217414d17af04896ed7415afa2e32a6d9686a330231dd0d0a0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f68e28fe1f55d438c3d42880d9ed4a20367c2e4bd5e8636372586dfa489868e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T00:07:22Z\\\",\\\"message\\\":\\\"s:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-controller-manager-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-controller-manager-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.58\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF1003 00:07:22.607875 6290 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed t\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:21Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d07c7c33d5db217414d17af04896ed7415afa2e32a6d9686a330231dd0d0a0f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T00:07:36Z\\\",\\\"message\\\":\\\"-crc openshift-multus/multus-additional-cni-plugins-zdfsz openshift-image-registry/node-ca-76ck7 openshift-machine-config-operator/machine-config-daemon-xzpk4 openshift-network-diagnostics/network-check-source-55646444c4-trplf openshift-network-operator/network-operator-58b4c7f79c-55gtf openshift-network-operator/iptables-alerter-4ln5h]\\\\nI1003 00:07:36.207478 6499 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI1003 00:07:36.207478 6499 lb_config.go:1031] Cluster endpoints for openshift-network-console/networking-console-plugin for network=default are: map[]\\\\nF1003 00:07:36.207491 6499 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy 
controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certific\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae1e89f85d166170fd478f58e653b46eaa1c8335c1b124af9fde5b805864197b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\
\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0669e6b46f57beece4c83940bcffe580a4b4e2db5451a9c9ca4a9ad486fc95b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0669e6b46f57beece4c83940bcffe580a4b4e2db5451a9c9ca4a9ad486fc95b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zc9xl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:37Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:37 crc kubenswrapper[4832]: I1003 00:07:37.666801 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:37 crc kubenswrapper[4832]: I1003 00:07:37.666843 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:37 crc kubenswrapper[4832]: I1003 00:07:37.666854 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:37 crc kubenswrapper[4832]: I1003 00:07:37.666868 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:37 crc kubenswrapper[4832]: I1003 00:07:37.666878 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:37Z","lastTransitionTime":"2025-10-03T00:07:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:37 crc kubenswrapper[4832]: I1003 00:07:37.746128 4832 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zc9xl_0f189ec1-8169-4886-bff6-f67f7958050e/ovnkube-controller/2.log" Oct 03 00:07:37 crc kubenswrapper[4832]: I1003 00:07:37.749160 4832 scope.go:117] "RemoveContainer" containerID="4d07c7c33d5db217414d17af04896ed7415afa2e32a6d9686a330231dd0d0a0f" Oct 03 00:07:37 crc kubenswrapper[4832]: E1003 00:07:37.749284 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-zc9xl_openshift-ovn-kubernetes(0f189ec1-8169-4886-bff6-f67f7958050e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" podUID="0f189ec1-8169-4886-bff6-f67f7958050e" Oct 03 00:07:37 crc kubenswrapper[4832]: I1003 00:07:37.764233 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:37Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:37 crc kubenswrapper[4832]: I1003 00:07:37.768722 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:37 crc kubenswrapper[4832]: I1003 00:07:37.768925 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:37 crc kubenswrapper[4832]: I1003 00:07:37.769025 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:37 crc kubenswrapper[4832]: I1003 00:07:37.769129 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:37 crc kubenswrapper[4832]: I1003 00:07:37.769217 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:37Z","lastTransitionTime":"2025-10-03T00:07:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:37 crc kubenswrapper[4832]: I1003 00:07:37.779489 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zdfsz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66a6f849-6870-4e28-8ca4-c849c22ad252\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53e3e718893a18a23dee11b28bb7285f26501527c6aa136acf2b8f7e0d7494c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f49f638397d6dffb41b5774d362b7441d3191f290149de6ecef5c38a6e8a7a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f49f638397d6dffb41b5774d362b7441d3191f290149de6ecef5c38a6e8a7a81\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d52a70ad1026d8eaaeed9dbb581eb90dbc962f036711e2f68085ac437c46b1fd\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d52a70ad1026d8eaaeed9dbb581eb90dbc962f036711e2f68085ac437c46b1fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ddd5a446bcc39d02bbaf8ae1b9a92b550b975c4280cea13a517076261e177bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ddd5a446bcc39d02bbaf8ae1b9a92b550b975c4280cea13a517076261e177bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f907c9873e2f8cbedebcc3116624dbaf2e6b8088d9f0ef52a5878e1515e73720\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f907c9873e2f8cbedebcc3116624dbaf2e6b8088d9f0ef52a5878e1515e73720\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d991977ea5d7140ddc09fecfc872067a05a7773c0066c94ec88c603b6f79f32e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d991977ea5d7140ddc09fecfc872067a05a7773c0066c94ec88c603b6f79f32e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7711e37ec666f0285f64eac11397ea7747d74997deaba1e9cc141218d28fbaa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7711e37ec666f0285f64eac11397ea7747d74997deaba1e9cc141218d28fbaa7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zdfsz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:37Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:37 crc kubenswrapper[4832]: I1003 00:07:37.793896 4832 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78a6b04d90ea1fd4356f0a54594a83fd6029be216c6263654ad4916f2c3b7f5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:37Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:37 crc kubenswrapper[4832]: I1003 00:07:37.805512 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-b9c9g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4369beec-ad3f-4de9-909b-7d0f8c76aad3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://898c5e76b7f0bbad018b2aa9db1e27240ccbd9d5b66854f548aa6d664abc8799\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvpm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-b9c9g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:37Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:37 crc kubenswrapper[4832]: I1003 00:07:37.825428 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f189ec1-8169-4886-bff6-f67f7958050e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ba40dbb3c6b603f4c75b3f017b0b8953a189aac6dba4cc0e1fa5174b6353f3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2c63d70335d75dec7e19ae160dfa2a00f834d9aa3725bb4b63b94aa488fca0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3b296d00954bf341fee4ad258c6ee44adbd2950c89e73aaf5141201b61c5332\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7712a571bceb2c88adb90f9f18cdef6f7666e96341c1f9c87e6a55c82b59efd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6ee507d1127a1ca82e96f57a031cfef9adbd0b310588ce49a32f0cbcab7b3f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e493b0aab45e7a7842b48f0a13a6850b3bcd8a8f2147880ff278b92b04e9535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d07c7c33d5db217414d17af04896ed7415afa2e32a6d9686a330231dd0d0a0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d07c7c33d5db217414d17af04896ed7415afa2e32a6d9686a330231dd0d0a0f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T00:07:36Z\\\",\\\"message\\\":\\\"-crc openshift-multus/multus-additional-cni-plugins-zdfsz openshift-image-registry/node-ca-76ck7 openshift-machine-config-operator/machine-config-daemon-xzpk4 openshift-network-diagnostics/network-check-source-55646444c4-trplf openshift-network-operator/network-operator-58b4c7f79c-55gtf openshift-network-operator/iptables-alerter-4ln5h]\\\\nI1003 00:07:36.207478 6499 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI1003 00:07:36.207478 6499 lb_config.go:1031] Cluster endpoints for openshift-network-console/networking-console-plugin for network=default are: map[]\\\\nF1003 00:07:36.207491 6499 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certific\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:35Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-zc9xl_openshift-ovn-kubernetes(0f189ec1-8169-4886-bff6-f67f7958050e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae1e89f85d166170fd478f58e653b46eaa1c8335c1b124af9fde5b805864197b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0669e6b46f57beece4c83940bcffe580a4b4e2db5451a9c9ca4a9ad486fc95b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0669e6b46f57beece4c83940bcffe580a4b4e2db5451a9c9ca4a9ad486fc95b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zc9xl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:37Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:37 crc kubenswrapper[4832]: I1003 00:07:37.848323 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fad7ffe6-f721-4c38-8acd-2d4de59b527b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da0287ba03b221e49a8840f93c53df8a865b9216a4e93f4a0fc4c62d79227004\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edaa07dca4d8aba0a6439a43af7397b5d9ef4d5a398f5c03ac93f3f41fb80ed9\\\",\\\"i
mage\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef77e4c71ea4d3dd7aa61afb0787bf88352730ae638cfefe1f9f9f033208692a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0636ac12ebbda7e4ee8bbe4f5be5454ca7a879c92c8f444466a34c387668398c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af8925d64d9d95117625cfca58e2f0c90128ce63c82370e3a505d48fce14e474\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"message\\\":\\\"file observer\\\\nW1003 00:07:06.003008 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 00:07:06.003232 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 00:07:06.007835 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-713505854/tls.crt::/tmp/serving-cert-713505854/tls.key\\\\\\\"\\\\nI1003 00:07:06.247728 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 00:07:06.253351 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 00:07:06.253377 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 00:07:06.253665 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 00:07:06.253677 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 00:07:06.261087 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 00:07:06.261596 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 00:07:06.261602 1 secure_serving.go:69] Use of insecure cipher 
'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 00:07:06.261606 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 00:07:06.261610 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 00:07:06.261613 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 00:07:06.261616 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 00:07:06.261905 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 00:07:06.264368 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc6fca5c4906594dc721c77dba62a306d840e88ed949a9a579a695bdc1f1c011\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://359b3f91c772c18a52ed1e5985275405b990187a860cf7a7f62c2f602f54c77e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://359b3f91c772c18a52ed1e5985275405b990187a860cf7a7f62c2f602f54c77e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:06:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:37Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:37 crc kubenswrapper[4832]: I1003 00:07:37.872278 4832 status_manager.go:875] "Failed to update status 
for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a530701-412d-4b02-b532-de7993881992\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2b0553c8f774a096bc1446325436a1a73a24f67b3512462ce0b5029092f30c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0451961c80d1affd1cc6903d8c4ccba7de47c25255826d49ee312958f7a73b79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09e57b2ff533ebe739ef6e23dcd8d6f5ef85143046db77ed3a95ffb87f771152\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://703d4dc13279deda56e8154ab7238b507e78852fd
477db2bbe6e3e19aa0fbd37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:06:47Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:37Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:37 crc kubenswrapper[4832]: I1003 00:07:37.873935 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:37 crc kubenswrapper[4832]: I1003 00:07:37.873971 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:37 crc kubenswrapper[4832]: I1003 00:07:37.873981 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:37 crc kubenswrapper[4832]: I1003 00:07:37.873999 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:37 crc kubenswrapper[4832]: I1003 00:07:37.874010 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:37Z","lastTransitionTime":"2025-10-03T00:07:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:37 crc kubenswrapper[4832]: I1003 00:07:37.891185 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:37Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:37 crc kubenswrapper[4832]: I1003 00:07:37.904461 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"13eeb165-19f7-4e3f-a442-1c9c570d853b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://196d3468b2a4ea6792c19f85b28df1b5df5e612428d986d2affed70158ba92f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfnk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://786805325c320e5c08542c14a9ac9790f7c2dec237737fe1aca1523d1f2d26a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfnk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xzpk4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:37Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:37 crc kubenswrapper[4832]: I1003 00:07:37.915277 4832 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/network-metrics-daemon-ppthn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3e796ab-f354-47c2-8f1b-1d29fe9ab059\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:25Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ppthn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:37Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:37 crc kubenswrapper[4832]: I1003 00:07:37.934650 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"31c3e0e9-c2a7-4c60-b523-9972df4adad5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1688f7425bdf5630147752e6d2c0dc58351b86d087c39bc6c4d73ffbf11949b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70f4161badca9a8087ab95b77ede5bc72d0c44b3f7f0223f2073354e76d6070d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5eb1fa15d84200378239e8a064cb83aa401750e92073f9d29d94858e65add776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://342b151ec083dc566428af857e4013b70d34fbf
70632147d13f5e251ef3460c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f39a6c9881b2f1d620a85a3c6887d82b81d18ecf599c66b967d5580689537314\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a180babeb6469b2158474f74047864c74117a5c272be9437712bc1729563108b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a180babeb6469b2158474f74047864c74117a5c272be9437712bc1729563108b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b12fcbe54af7073178320f8a130e78196b2d1d3a055c4d8410e4cc1da2d4686e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b12fcbe54af7073178320f8a130e78196b2d1d3a055c4d8410e4cc1da2d4686e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f5e7853d0c8e96873c687c75d7226d8c11ed0ab5f4f2e1e975423de06b7c8c10\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5e7853d0c8e96873c687c75d7226d8c11ed0ab5f4f2e1e975423de06b7c8c10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:06:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:37Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:37 crc kubenswrapper[4832]: I1003 00:07:37.947348 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:37Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:37 crc kubenswrapper[4832]: I1003 00:07:37.957422 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 00:07:37 crc kubenswrapper[4832]: I1003 00:07:37.957519 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 00:07:37 crc kubenswrapper[4832]: I1003 00:07:37.957542 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 00:07:37 crc kubenswrapper[4832]: I1003 00:07:37.957566 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 00:07:37 crc kubenswrapper[4832]: I1003 00:07:37.957583 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 00:07:37 crc kubenswrapper[4832]: E1003 00:07:37.957639 4832 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 03 00:07:37 crc kubenswrapper[4832]: E1003 00:07:37.957678 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf 
podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 00:08:09.957666061 +0000 UTC m=+83.219821167 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 03 00:07:37 crc kubenswrapper[4832]: E1003 00:07:37.957839 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 00:08:09.957831356 +0000 UTC m=+83.219986452 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 00:07:37 crc kubenswrapper[4832]: E1003 00:07:37.957901 4832 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 00:07:37 crc kubenswrapper[4832]: E1003 00:07:37.957912 4832 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 00:07:37 crc kubenswrapper[4832]: E1003 00:07:37.957921 4832 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 00:07:37 crc kubenswrapper[4832]: E1003 00:07:37.957941 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-03 00:08:09.95793594 +0000 UTC m=+83.220091046 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 00:07:37 crc kubenswrapper[4832]: E1003 00:07:37.958070 4832 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 00:07:37 crc kubenswrapper[4832]: E1003 00:07:37.958106 4832 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 00:07:37 crc kubenswrapper[4832]: E1003 00:07:37.958120 4832 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 00:07:37 crc kubenswrapper[4832]: E1003 00:07:37.958070 4832 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 00:07:37 crc kubenswrapper[4832]: E1003 00:07:37.958177 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-03 00:08:09.958158167 +0000 UTC m=+83.220313283 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 00:07:37 crc kubenswrapper[4832]: E1003 00:07:37.958235 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 00:08:09.958213039 +0000 UTC m=+83.220368185 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 00:07:37 crc kubenswrapper[4832]: I1003 00:07:37.958583 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4175afbbdf7905f2c69f608e1cd85fdfae2818582d88e9a6d46c20ba0cede788\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:37Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:37 crc kubenswrapper[4832]: I1003 00:07:37.970816 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6753f7eb0f3ff7e4c94125513be77026242079d658b7139e86701a2b04ba5d1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://823db4c8596fa85759a3d818b0282b8b8b2c4c2e948160accbd306ef09e38060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:37Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:37 crc kubenswrapper[4832]: I1003 00:07:37.976524 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:37 crc kubenswrapper[4832]: I1003 00:07:37.976577 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:37 crc kubenswrapper[4832]: I1003 00:07:37.976602 4832 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Oct 03 00:07:37 crc kubenswrapper[4832]: I1003 00:07:37.976632 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:37 crc kubenswrapper[4832]: I1003 00:07:37.976655 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:37Z","lastTransitionTime":"2025-10-03T00:07:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:37 crc kubenswrapper[4832]: I1003 00:07:37.985137 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rwqt7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96bf1ee1-9249-4db9-83da-17eb15aa5f9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://778c9afa5747e424ff72688b9d90dd21e3faec1f409c2b9683b5a16a37685369\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"
},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm92t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rwqt7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:37Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:37 crc kubenswrapper[4832]: I1003 00:07:37.999024 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-76ck7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b962b7f-e442-4e25-97f4-21f16b9f1324\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e5df7a035e3c381c541f08eec6fffe846e7002a534b4c6ae1316a15cf74e469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ksm2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:13Z\\\"}}\" for pod 
\"openshift-image-registry\"/\"node-ca-76ck7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:37Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:38 crc kubenswrapper[4832]: I1003 00:07:38.012231 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4j5bh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7262a85e-7e44-4422-b1d2-5d045503e06a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afd2b89ee54380c73b04bde347dce4d74a847e4ba94fc7aeabd5ac24e4dfd4b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vwhr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f03e95e621cb9e7d0c04b426440cf746eeccf68892215e1ed9e963935b9fdba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vwhr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126
.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4j5bh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:38Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:38 crc kubenswrapper[4832]: I1003 00:07:38.078725 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:38 crc kubenswrapper[4832]: I1003 00:07:38.078791 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:38 crc kubenswrapper[4832]: I1003 00:07:38.078807 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:38 crc kubenswrapper[4832]: I1003 00:07:38.078823 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:38 crc kubenswrapper[4832]: I1003 00:07:38.078834 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:38Z","lastTransitionTime":"2025-10-03T00:07:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:38 crc kubenswrapper[4832]: I1003 00:07:38.181382 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:38 crc kubenswrapper[4832]: I1003 00:07:38.181449 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:38 crc kubenswrapper[4832]: I1003 00:07:38.181461 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:38 crc kubenswrapper[4832]: I1003 00:07:38.181478 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:38 crc kubenswrapper[4832]: I1003 00:07:38.181490 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:38Z","lastTransitionTime":"2025-10-03T00:07:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:38 crc kubenswrapper[4832]: I1003 00:07:38.284190 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:38 crc kubenswrapper[4832]: I1003 00:07:38.284254 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:38 crc kubenswrapper[4832]: I1003 00:07:38.284264 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:38 crc kubenswrapper[4832]: I1003 00:07:38.284294 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:38 crc kubenswrapper[4832]: I1003 00:07:38.284304 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:38Z","lastTransitionTime":"2025-10-03T00:07:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:38 crc kubenswrapper[4832]: I1003 00:07:38.386716 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:38 crc kubenswrapper[4832]: I1003 00:07:38.386800 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:38 crc kubenswrapper[4832]: I1003 00:07:38.386819 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:38 crc kubenswrapper[4832]: I1003 00:07:38.386844 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:38 crc kubenswrapper[4832]: I1003 00:07:38.386863 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:38Z","lastTransitionTime":"2025-10-03T00:07:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:38 crc kubenswrapper[4832]: I1003 00:07:38.489165 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:38 crc kubenswrapper[4832]: I1003 00:07:38.489235 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:38 crc kubenswrapper[4832]: I1003 00:07:38.489261 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:38 crc kubenswrapper[4832]: I1003 00:07:38.489291 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:38 crc kubenswrapper[4832]: I1003 00:07:38.489313 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:38Z","lastTransitionTime":"2025-10-03T00:07:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:38 crc kubenswrapper[4832]: I1003 00:07:38.592113 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:38 crc kubenswrapper[4832]: I1003 00:07:38.592179 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:38 crc kubenswrapper[4832]: I1003 00:07:38.592195 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:38 crc kubenswrapper[4832]: I1003 00:07:38.592226 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:38 crc kubenswrapper[4832]: I1003 00:07:38.592242 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:38Z","lastTransitionTime":"2025-10-03T00:07:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:38 crc kubenswrapper[4832]: I1003 00:07:38.694933 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:38 crc kubenswrapper[4832]: I1003 00:07:38.694986 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:38 crc kubenswrapper[4832]: I1003 00:07:38.695000 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:38 crc kubenswrapper[4832]: I1003 00:07:38.695017 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:38 crc kubenswrapper[4832]: I1003 00:07:38.695030 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:38Z","lastTransitionTime":"2025-10-03T00:07:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:38 crc kubenswrapper[4832]: I1003 00:07:38.797308 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:38 crc kubenswrapper[4832]: I1003 00:07:38.797378 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:38 crc kubenswrapper[4832]: I1003 00:07:38.797413 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:38 crc kubenswrapper[4832]: I1003 00:07:38.797441 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:38 crc kubenswrapper[4832]: I1003 00:07:38.797461 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:38Z","lastTransitionTime":"2025-10-03T00:07:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:38 crc kubenswrapper[4832]: I1003 00:07:38.900088 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:38 crc kubenswrapper[4832]: I1003 00:07:38.900155 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:38 crc kubenswrapper[4832]: I1003 00:07:38.900187 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:38 crc kubenswrapper[4832]: I1003 00:07:38.900217 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:38 crc kubenswrapper[4832]: I1003 00:07:38.900238 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:38Z","lastTransitionTime":"2025-10-03T00:07:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:39 crc kubenswrapper[4832]: I1003 00:07:39.002850 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:39 crc kubenswrapper[4832]: I1003 00:07:39.002890 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:39 crc kubenswrapper[4832]: I1003 00:07:39.002901 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:39 crc kubenswrapper[4832]: I1003 00:07:39.002916 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:39 crc kubenswrapper[4832]: I1003 00:07:39.002927 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:39Z","lastTransitionTime":"2025-10-03T00:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:39 crc kubenswrapper[4832]: I1003 00:07:39.114005 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:39 crc kubenswrapper[4832]: I1003 00:07:39.114061 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:39 crc kubenswrapper[4832]: I1003 00:07:39.114079 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:39 crc kubenswrapper[4832]: I1003 00:07:39.114102 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:39 crc kubenswrapper[4832]: I1003 00:07:39.114119 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:39Z","lastTransitionTime":"2025-10-03T00:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:39 crc kubenswrapper[4832]: I1003 00:07:39.216979 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:39 crc kubenswrapper[4832]: I1003 00:07:39.217013 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:39 crc kubenswrapper[4832]: I1003 00:07:39.217022 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:39 crc kubenswrapper[4832]: I1003 00:07:39.217036 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:39 crc kubenswrapper[4832]: I1003 00:07:39.217044 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:39Z","lastTransitionTime":"2025-10-03T00:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:39 crc kubenswrapper[4832]: I1003 00:07:39.319885 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:39 crc kubenswrapper[4832]: I1003 00:07:39.319965 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:39 crc kubenswrapper[4832]: I1003 00:07:39.319983 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:39 crc kubenswrapper[4832]: I1003 00:07:39.320010 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:39 crc kubenswrapper[4832]: I1003 00:07:39.320026 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:39Z","lastTransitionTime":"2025-10-03T00:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:39 crc kubenswrapper[4832]: I1003 00:07:39.385872 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 00:07:39 crc kubenswrapper[4832]: I1003 00:07:39.385923 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 00:07:39 crc kubenswrapper[4832]: I1003 00:07:39.385955 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 00:07:39 crc kubenswrapper[4832]: E1003 00:07:39.386057 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 00:07:39 crc kubenswrapper[4832]: I1003 00:07:39.386082 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ppthn" Oct 03 00:07:39 crc kubenswrapper[4832]: E1003 00:07:39.386217 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 00:07:39 crc kubenswrapper[4832]: E1003 00:07:39.386395 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ppthn" podUID="f3e796ab-f354-47c2-8f1b-1d29fe9ab059" Oct 03 00:07:39 crc kubenswrapper[4832]: E1003 00:07:39.386488 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 00:07:39 crc kubenswrapper[4832]: I1003 00:07:39.423384 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:39 crc kubenswrapper[4832]: I1003 00:07:39.423440 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:39 crc kubenswrapper[4832]: I1003 00:07:39.423452 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:39 crc kubenswrapper[4832]: I1003 00:07:39.423468 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:39 crc kubenswrapper[4832]: I1003 00:07:39.423479 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:39Z","lastTransitionTime":"2025-10-03T00:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:39 crc kubenswrapper[4832]: I1003 00:07:39.526333 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:39 crc kubenswrapper[4832]: I1003 00:07:39.526367 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:39 crc kubenswrapper[4832]: I1003 00:07:39.526378 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:39 crc kubenswrapper[4832]: I1003 00:07:39.526394 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:39 crc kubenswrapper[4832]: I1003 00:07:39.526405 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:39Z","lastTransitionTime":"2025-10-03T00:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:39 crc kubenswrapper[4832]: I1003 00:07:39.629169 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:39 crc kubenswrapper[4832]: I1003 00:07:39.629239 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:39 crc kubenswrapper[4832]: I1003 00:07:39.629257 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:39 crc kubenswrapper[4832]: I1003 00:07:39.629277 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:39 crc kubenswrapper[4832]: I1003 00:07:39.629292 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:39Z","lastTransitionTime":"2025-10-03T00:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:39 crc kubenswrapper[4832]: I1003 00:07:39.732618 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:39 crc kubenswrapper[4832]: I1003 00:07:39.732736 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:39 crc kubenswrapper[4832]: I1003 00:07:39.732748 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:39 crc kubenswrapper[4832]: I1003 00:07:39.732783 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:39 crc kubenswrapper[4832]: I1003 00:07:39.732793 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:39Z","lastTransitionTime":"2025-10-03T00:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:39 crc kubenswrapper[4832]: I1003 00:07:39.835400 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:39 crc kubenswrapper[4832]: I1003 00:07:39.835451 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:39 crc kubenswrapper[4832]: I1003 00:07:39.835463 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:39 crc kubenswrapper[4832]: I1003 00:07:39.835481 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:39 crc kubenswrapper[4832]: I1003 00:07:39.835494 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:39Z","lastTransitionTime":"2025-10-03T00:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:39 crc kubenswrapper[4832]: I1003 00:07:39.938129 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:39 crc kubenswrapper[4832]: I1003 00:07:39.938190 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:39 crc kubenswrapper[4832]: I1003 00:07:39.938207 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:39 crc kubenswrapper[4832]: I1003 00:07:39.938231 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:39 crc kubenswrapper[4832]: I1003 00:07:39.938248 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:39Z","lastTransitionTime":"2025-10-03T00:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:40 crc kubenswrapper[4832]: I1003 00:07:40.040842 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:40 crc kubenswrapper[4832]: I1003 00:07:40.040918 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:40 crc kubenswrapper[4832]: I1003 00:07:40.040935 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:40 crc kubenswrapper[4832]: I1003 00:07:40.040966 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:40 crc kubenswrapper[4832]: I1003 00:07:40.040983 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:40Z","lastTransitionTime":"2025-10-03T00:07:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:40 crc kubenswrapper[4832]: I1003 00:07:40.144199 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:40 crc kubenswrapper[4832]: I1003 00:07:40.144257 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:40 crc kubenswrapper[4832]: I1003 00:07:40.144275 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:40 crc kubenswrapper[4832]: I1003 00:07:40.144297 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:40 crc kubenswrapper[4832]: I1003 00:07:40.144313 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:40Z","lastTransitionTime":"2025-10-03T00:07:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:40 crc kubenswrapper[4832]: I1003 00:07:40.247579 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:40 crc kubenswrapper[4832]: I1003 00:07:40.247616 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:40 crc kubenswrapper[4832]: I1003 00:07:40.247625 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:40 crc kubenswrapper[4832]: I1003 00:07:40.247638 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:40 crc kubenswrapper[4832]: I1003 00:07:40.247646 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:40Z","lastTransitionTime":"2025-10-03T00:07:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:40 crc kubenswrapper[4832]: I1003 00:07:40.350959 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:40 crc kubenswrapper[4832]: I1003 00:07:40.351066 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:40 crc kubenswrapper[4832]: I1003 00:07:40.351091 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:40 crc kubenswrapper[4832]: I1003 00:07:40.351119 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:40 crc kubenswrapper[4832]: I1003 00:07:40.351146 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:40Z","lastTransitionTime":"2025-10-03T00:07:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:40 crc kubenswrapper[4832]: I1003 00:07:40.454878 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:40 crc kubenswrapper[4832]: I1003 00:07:40.455000 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:40 crc kubenswrapper[4832]: I1003 00:07:40.455024 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:40 crc kubenswrapper[4832]: I1003 00:07:40.455047 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:40 crc kubenswrapper[4832]: I1003 00:07:40.455066 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:40Z","lastTransitionTime":"2025-10-03T00:07:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:40 crc kubenswrapper[4832]: I1003 00:07:40.558382 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:40 crc kubenswrapper[4832]: I1003 00:07:40.558436 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:40 crc kubenswrapper[4832]: I1003 00:07:40.558447 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:40 crc kubenswrapper[4832]: I1003 00:07:40.558464 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:40 crc kubenswrapper[4832]: I1003 00:07:40.558477 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:40Z","lastTransitionTime":"2025-10-03T00:07:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:40 crc kubenswrapper[4832]: I1003 00:07:40.661173 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:40 crc kubenswrapper[4832]: I1003 00:07:40.661243 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:40 crc kubenswrapper[4832]: I1003 00:07:40.661262 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:40 crc kubenswrapper[4832]: I1003 00:07:40.661287 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:40 crc kubenswrapper[4832]: I1003 00:07:40.661311 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:40Z","lastTransitionTime":"2025-10-03T00:07:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:40 crc kubenswrapper[4832]: I1003 00:07:40.763578 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:40 crc kubenswrapper[4832]: I1003 00:07:40.764007 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:40 crc kubenswrapper[4832]: I1003 00:07:40.764171 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:40 crc kubenswrapper[4832]: I1003 00:07:40.764313 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:40 crc kubenswrapper[4832]: I1003 00:07:40.764441 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:40Z","lastTransitionTime":"2025-10-03T00:07:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:40 crc kubenswrapper[4832]: I1003 00:07:40.867140 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:40 crc kubenswrapper[4832]: I1003 00:07:40.867377 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:40 crc kubenswrapper[4832]: I1003 00:07:40.867388 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:40 crc kubenswrapper[4832]: I1003 00:07:40.867405 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:40 crc kubenswrapper[4832]: I1003 00:07:40.867419 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:40Z","lastTransitionTime":"2025-10-03T00:07:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:40 crc kubenswrapper[4832]: I1003 00:07:40.970645 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:40 crc kubenswrapper[4832]: I1003 00:07:40.970687 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:40 crc kubenswrapper[4832]: I1003 00:07:40.970696 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:40 crc kubenswrapper[4832]: I1003 00:07:40.970709 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:40 crc kubenswrapper[4832]: I1003 00:07:40.970718 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:40Z","lastTransitionTime":"2025-10-03T00:07:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:41 crc kubenswrapper[4832]: I1003 00:07:41.073914 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:41 crc kubenswrapper[4832]: I1003 00:07:41.073975 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:41 crc kubenswrapper[4832]: I1003 00:07:41.073991 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:41 crc kubenswrapper[4832]: I1003 00:07:41.074015 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:41 crc kubenswrapper[4832]: I1003 00:07:41.074034 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:41Z","lastTransitionTime":"2025-10-03T00:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:41 crc kubenswrapper[4832]: I1003 00:07:41.176554 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:41 crc kubenswrapper[4832]: I1003 00:07:41.176617 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:41 crc kubenswrapper[4832]: I1003 00:07:41.176638 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:41 crc kubenswrapper[4832]: I1003 00:07:41.176669 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:41 crc kubenswrapper[4832]: I1003 00:07:41.176692 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:41Z","lastTransitionTime":"2025-10-03T00:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:41 crc kubenswrapper[4832]: I1003 00:07:41.280049 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:41 crc kubenswrapper[4832]: I1003 00:07:41.280121 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:41 crc kubenswrapper[4832]: I1003 00:07:41.280138 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:41 crc kubenswrapper[4832]: I1003 00:07:41.280162 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:41 crc kubenswrapper[4832]: I1003 00:07:41.280179 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:41Z","lastTransitionTime":"2025-10-03T00:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:41 crc kubenswrapper[4832]: I1003 00:07:41.384495 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:41 crc kubenswrapper[4832]: I1003 00:07:41.384553 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:41 crc kubenswrapper[4832]: I1003 00:07:41.384568 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:41 crc kubenswrapper[4832]: I1003 00:07:41.384587 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:41 crc kubenswrapper[4832]: I1003 00:07:41.384600 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:41Z","lastTransitionTime":"2025-10-03T00:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:41 crc kubenswrapper[4832]: I1003 00:07:41.385837 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 00:07:41 crc kubenswrapper[4832]: I1003 00:07:41.385920 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 00:07:41 crc kubenswrapper[4832]: I1003 00:07:41.385955 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ppthn" Oct 03 00:07:41 crc kubenswrapper[4832]: E1003 00:07:41.386070 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 00:07:41 crc kubenswrapper[4832]: I1003 00:07:41.386155 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 00:07:41 crc kubenswrapper[4832]: E1003 00:07:41.386177 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ppthn" podUID="f3e796ab-f354-47c2-8f1b-1d29fe9ab059" Oct 03 00:07:41 crc kubenswrapper[4832]: E1003 00:07:41.386381 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 00:07:41 crc kubenswrapper[4832]: E1003 00:07:41.386571 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 00:07:41 crc kubenswrapper[4832]: I1003 00:07:41.486603 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:41 crc kubenswrapper[4832]: I1003 00:07:41.486637 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:41 crc kubenswrapper[4832]: I1003 00:07:41.486648 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:41 crc kubenswrapper[4832]: I1003 00:07:41.486664 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:41 crc kubenswrapper[4832]: I1003 00:07:41.486675 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:41Z","lastTransitionTime":"2025-10-03T00:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:41 crc kubenswrapper[4832]: I1003 00:07:41.497303 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f3e796ab-f354-47c2-8f1b-1d29fe9ab059-metrics-certs\") pod \"network-metrics-daemon-ppthn\" (UID: \"f3e796ab-f354-47c2-8f1b-1d29fe9ab059\") " pod="openshift-multus/network-metrics-daemon-ppthn" Oct 03 00:07:41 crc kubenswrapper[4832]: E1003 00:07:41.497420 4832 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 00:07:41 crc kubenswrapper[4832]: E1003 00:07:41.497485 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f3e796ab-f354-47c2-8f1b-1d29fe9ab059-metrics-certs podName:f3e796ab-f354-47c2-8f1b-1d29fe9ab059 nodeName:}" failed. No retries permitted until 2025-10-03 00:07:57.497467907 +0000 UTC m=+70.759623023 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f3e796ab-f354-47c2-8f1b-1d29fe9ab059-metrics-certs") pod "network-metrics-daemon-ppthn" (UID: "f3e796ab-f354-47c2-8f1b-1d29fe9ab059") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 00:07:41 crc kubenswrapper[4832]: I1003 00:07:41.589195 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:41 crc kubenswrapper[4832]: I1003 00:07:41.589248 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:41 crc kubenswrapper[4832]: I1003 00:07:41.589259 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:41 crc kubenswrapper[4832]: I1003 00:07:41.589278 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:41 crc kubenswrapper[4832]: I1003 00:07:41.589290 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:41Z","lastTransitionTime":"2025-10-03T00:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:41 crc kubenswrapper[4832]: I1003 00:07:41.692313 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:41 crc kubenswrapper[4832]: I1003 00:07:41.692401 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:41 crc kubenswrapper[4832]: I1003 00:07:41.692420 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:41 crc kubenswrapper[4832]: I1003 00:07:41.692444 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:41 crc kubenswrapper[4832]: I1003 00:07:41.692466 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:41Z","lastTransitionTime":"2025-10-03T00:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:41 crc kubenswrapper[4832]: I1003 00:07:41.795700 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:41 crc kubenswrapper[4832]: I1003 00:07:41.795805 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:41 crc kubenswrapper[4832]: I1003 00:07:41.795844 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:41 crc kubenswrapper[4832]: I1003 00:07:41.795874 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:41 crc kubenswrapper[4832]: I1003 00:07:41.795896 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:41Z","lastTransitionTime":"2025-10-03T00:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:41 crc kubenswrapper[4832]: I1003 00:07:41.899311 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:41 crc kubenswrapper[4832]: I1003 00:07:41.899349 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:41 crc kubenswrapper[4832]: I1003 00:07:41.899361 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:41 crc kubenswrapper[4832]: I1003 00:07:41.899392 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:41 crc kubenswrapper[4832]: I1003 00:07:41.899402 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:41Z","lastTransitionTime":"2025-10-03T00:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:42 crc kubenswrapper[4832]: I1003 00:07:42.004581 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:42 crc kubenswrapper[4832]: I1003 00:07:42.004623 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:42 crc kubenswrapper[4832]: I1003 00:07:42.004634 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:42 crc kubenswrapper[4832]: I1003 00:07:42.004659 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:42 crc kubenswrapper[4832]: I1003 00:07:42.004682 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:42Z","lastTransitionTime":"2025-10-03T00:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:42 crc kubenswrapper[4832]: I1003 00:07:42.109136 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:42 crc kubenswrapper[4832]: I1003 00:07:42.109208 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:42 crc kubenswrapper[4832]: I1003 00:07:42.109231 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:42 crc kubenswrapper[4832]: I1003 00:07:42.109261 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:42 crc kubenswrapper[4832]: I1003 00:07:42.109284 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:42Z","lastTransitionTime":"2025-10-03T00:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:42 crc kubenswrapper[4832]: I1003 00:07:42.211866 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:42 crc kubenswrapper[4832]: I1003 00:07:42.211916 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:42 crc kubenswrapper[4832]: I1003 00:07:42.211934 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:42 crc kubenswrapper[4832]: I1003 00:07:42.211956 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:42 crc kubenswrapper[4832]: I1003 00:07:42.211973 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:42Z","lastTransitionTime":"2025-10-03T00:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:42 crc kubenswrapper[4832]: I1003 00:07:42.315192 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:42 crc kubenswrapper[4832]: I1003 00:07:42.315251 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:42 crc kubenswrapper[4832]: I1003 00:07:42.315260 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:42 crc kubenswrapper[4832]: I1003 00:07:42.315274 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:42 crc kubenswrapper[4832]: I1003 00:07:42.315282 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:42Z","lastTransitionTime":"2025-10-03T00:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:42 crc kubenswrapper[4832]: I1003 00:07:42.418361 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:42 crc kubenswrapper[4832]: I1003 00:07:42.418414 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:42 crc kubenswrapper[4832]: I1003 00:07:42.418423 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:42 crc kubenswrapper[4832]: I1003 00:07:42.418437 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:42 crc kubenswrapper[4832]: I1003 00:07:42.418445 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:42Z","lastTransitionTime":"2025-10-03T00:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:42 crc kubenswrapper[4832]: I1003 00:07:42.521086 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:42 crc kubenswrapper[4832]: I1003 00:07:42.521149 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:42 crc kubenswrapper[4832]: I1003 00:07:42.521167 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:42 crc kubenswrapper[4832]: I1003 00:07:42.521194 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:42 crc kubenswrapper[4832]: I1003 00:07:42.521215 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:42Z","lastTransitionTime":"2025-10-03T00:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:42 crc kubenswrapper[4832]: I1003 00:07:42.623074 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:42 crc kubenswrapper[4832]: I1003 00:07:42.623131 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:42 crc kubenswrapper[4832]: I1003 00:07:42.623141 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:42 crc kubenswrapper[4832]: I1003 00:07:42.623156 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:42 crc kubenswrapper[4832]: I1003 00:07:42.623175 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:42Z","lastTransitionTime":"2025-10-03T00:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:42 crc kubenswrapper[4832]: I1003 00:07:42.724929 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:42 crc kubenswrapper[4832]: I1003 00:07:42.724966 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:42 crc kubenswrapper[4832]: I1003 00:07:42.724978 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:42 crc kubenswrapper[4832]: I1003 00:07:42.724993 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:42 crc kubenswrapper[4832]: I1003 00:07:42.725005 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:42Z","lastTransitionTime":"2025-10-03T00:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:42 crc kubenswrapper[4832]: I1003 00:07:42.827447 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:42 crc kubenswrapper[4832]: I1003 00:07:42.827489 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:42 crc kubenswrapper[4832]: I1003 00:07:42.827497 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:42 crc kubenswrapper[4832]: I1003 00:07:42.827525 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:42 crc kubenswrapper[4832]: I1003 00:07:42.827535 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:42Z","lastTransitionTime":"2025-10-03T00:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:42 crc kubenswrapper[4832]: I1003 00:07:42.929600 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:42 crc kubenswrapper[4832]: I1003 00:07:42.929663 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:42 crc kubenswrapper[4832]: I1003 00:07:42.929672 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:42 crc kubenswrapper[4832]: I1003 00:07:42.929687 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:42 crc kubenswrapper[4832]: I1003 00:07:42.929696 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:42Z","lastTransitionTime":"2025-10-03T00:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:43 crc kubenswrapper[4832]: I1003 00:07:43.032463 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:43 crc kubenswrapper[4832]: I1003 00:07:43.032521 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:43 crc kubenswrapper[4832]: I1003 00:07:43.032531 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:43 crc kubenswrapper[4832]: I1003 00:07:43.032546 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:43 crc kubenswrapper[4832]: I1003 00:07:43.032555 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:43Z","lastTransitionTime":"2025-10-03T00:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:43 crc kubenswrapper[4832]: I1003 00:07:43.111900 4832 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 03 00:07:43 crc kubenswrapper[4832]: I1003 00:07:43.125155 4832 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Oct 03 00:07:43 crc kubenswrapper[4832]: I1003 00:07:43.134701 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:43 crc kubenswrapper[4832]: I1003 00:07:43.134786 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:43 crc kubenswrapper[4832]: I1003 00:07:43.134803 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:43 crc kubenswrapper[4832]: I1003 00:07:43.134834 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:43 crc kubenswrapper[4832]: I1003 00:07:43.134851 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:43Z","lastTransitionTime":"2025-10-03T00:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:43 crc kubenswrapper[4832]: I1003 00:07:43.135866 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rwqt7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96bf1ee1-9249-4db9-83da-17eb15aa5f9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://778c9afa5747e424ff72688b9d90dd21e3faec1f409c2b9683b5a16a37685369\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm92t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rwqt7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:43Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:43 crc kubenswrapper[4832]: I1003 00:07:43.151072 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-76ck7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b962b7f-e442-4e25-97f4-21f16b9f1324\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e5df7a035e3c381c541f08eec6fffe846e7002a534b4c6ae1316a15cf74e469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ksm2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-76ck7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:43Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:43 crc kubenswrapper[4832]: I1003 00:07:43.166149 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:43 crc kubenswrapper[4832]: I1003 00:07:43.166186 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" 
Oct 03 00:07:43 crc kubenswrapper[4832]: I1003 00:07:43.166197 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:43 crc kubenswrapper[4832]: I1003 00:07:43.166234 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:43 crc kubenswrapper[4832]: I1003 00:07:43.166247 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:43Z","lastTransitionTime":"2025-10-03T00:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:43 crc kubenswrapper[4832]: I1003 00:07:43.169822 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4j5bh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7262a85e-7e44-4422-b1d2-5d045503e06a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afd2b89ee54380c73b04bde347dce4d74a847e4ba94fc7aeabd5ac24e4dfd4b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vwhr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f03e95e621cb9e7d0c04b426440cf746eeccf68892215e1ed9e963935b9fdba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2025-10-03T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vwhr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4j5bh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:43Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:43 crc kubenswrapper[4832]: I1003 00:07:43.186157 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ppthn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3e796ab-f354-47c2-8f1b-1d29fe9ab059\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:25Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ppthn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:43Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:43 crc kubenswrapper[4832]: E1003 00:07:43.186330 4832 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T00:07:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T00:07:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T00:07:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T00:07:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056
b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951
},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1e5f816a-2677-4a9b-8b73-ae876a2ba5f2\\\",\\\"systemUUID\\\":\\\"22c336a9-c598-416a-a85e-c54be0a6a3c5\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"cru
n\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:43Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:43 crc kubenswrapper[4832]: I1003 00:07:43.190886 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:43 crc kubenswrapper[4832]: I1003 00:07:43.190923 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:43 crc kubenswrapper[4832]: I1003 00:07:43.190934 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:43 crc kubenswrapper[4832]: I1003 00:07:43.190949 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:43 crc kubenswrapper[4832]: I1003 00:07:43.190959 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:43Z","lastTransitionTime":"2025-10-03T00:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:43 crc kubenswrapper[4832]: E1003 00:07:43.207712 4832 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T00:07:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T00:07:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T00:07:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T00:07:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1e5f816a-2677-4a9b-8b73-ae876a2ba5f2\\\",\\\"systemUUID\\\":\\\"22c336a9-c598-416a-a85e-c54be0a6a3c5\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:43Z is after 
2025-08-24T17:21:41Z" Oct 03 00:07:43 crc kubenswrapper[4832]: I1003 00:07:43.212289 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:43 crc kubenswrapper[4832]: I1003 00:07:43.212359 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:43 crc kubenswrapper[4832]: I1003 00:07:43.212381 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:43 crc kubenswrapper[4832]: I1003 00:07:43.212411 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:43 crc kubenswrapper[4832]: I1003 00:07:43.212437 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:43Z","lastTransitionTime":"2025-10-03T00:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:43 crc kubenswrapper[4832]: I1003 00:07:43.214907 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31c3e0e9-c2a7-4c60-b523-9972df4adad5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1688f7425bdf5630147752e6d2c0dc58351b86d087c39bc6c4d73ffbf11949b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70f4161badca9a8087ab95b77ede5bc72d0c44b3f7f0223f2073354e76d6070d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877
441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5eb1fa15d84200378239e8a064cb83aa401750e92073f9d29d94858e65add776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://342b151ec083dc566428af857e4013b70d34fbf70632147d13f5e251ef3460c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f39a6c9881b2f1d620a85a3c6887d82b81d18ecf599c66b967d5580689537314\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a180babeb6469b2158474f74047864c74117a5c272be9437712bc1729563108b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\
"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a180babeb6469b2158474f74047864c74117a5c272be9437712bc1729563108b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b12fcbe54af7073178320f8a130e78196b2d1d3a055c4d8410e4cc1da2d4686e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b12fcbe54af7073178320f8a130e78196b2d1d3a055c4d8410e4cc1da2d4686e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f5e7853d0c8e96873c687c75d7226d8c11ed0ab5f4f2e1e975423de06b7c8c10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5e7853d0c8e96873c687c75d7226d8c11ed0ab5f4f2e1e975423de06b7c8c10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:06:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:43Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:43 crc kubenswrapper[4832]: E1003 00:07:43.226407 4832 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T00:07:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T00:07:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T00:07:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T00:07:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae
669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1e5f816a-2677-4a9b-8b73-ae876a2ba5f2\\\",\\\"systemUUID\\\":\\\"22c336a9-c598-416a-a85e-c54be0a6a3c5\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:43Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:43 crc kubenswrapper[4832]: I1003 00:07:43.230595 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:43 crc kubenswrapper[4832]: I1003 00:07:43.230664 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:43 crc kubenswrapper[4832]: I1003 00:07:43.230687 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:43 crc kubenswrapper[4832]: I1003 00:07:43.230743 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:43 crc kubenswrapper[4832]: I1003 00:07:43.230798 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:43Z","lastTransitionTime":"2025-10-03T00:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:43 crc kubenswrapper[4832]: I1003 00:07:43.231581 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:43Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:43 crc kubenswrapper[4832]: I1003 00:07:43.243770 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4175afbbdf7905f2c69f608e1cd85fdfae2818582d88e9a6d46c20ba0cede788\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:43Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:43 crc kubenswrapper[4832]: E1003 00:07:43.245550 4832 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T00:07:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T00:07:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T00:07:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T00:07:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056
b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951
},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1e5f816a-2677-4a9b-8b73-ae876a2ba5f2\\\",\\\"systemUUID\\\":\\\"22c336a9-c598-416a-a85e-c54be0a6a3c5\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"cru
n\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:43Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:43 crc kubenswrapper[4832]: I1003 00:07:43.249210 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:43 crc kubenswrapper[4832]: I1003 00:07:43.249269 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:43 crc kubenswrapper[4832]: I1003 00:07:43.249280 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:43 crc kubenswrapper[4832]: I1003 00:07:43.249295 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:43 crc kubenswrapper[4832]: I1003 00:07:43.249306 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:43Z","lastTransitionTime":"2025-10-03T00:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:43 crc kubenswrapper[4832]: I1003 00:07:43.257981 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6753f7eb0f3ff7e4c94125513be77026242079d658b7139e86701a2b04ba5d1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://823db4c8596fa85759a3d818b0282b8b8b2c4c2e948160accbd306ef09e38060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:43Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:43 crc kubenswrapper[4832]: E1003 00:07:43.261915 4832 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T00:07:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T00:07:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T00:07:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T00:07:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae
669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1e5f816a-2677-4a9b-8b73-ae876a2ba5f2\\\",\\\"systemUUID\\\":\\\"22c336a9-c598-416a-a85e-c54be0a6a3c5\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:43Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:43 crc kubenswrapper[4832]: E1003 00:07:43.262024 4832 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 03 00:07:43 crc kubenswrapper[4832]: I1003 00:07:43.263866 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:43 crc kubenswrapper[4832]: I1003 00:07:43.263894 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:43 crc kubenswrapper[4832]: I1003 00:07:43.263902 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:43 crc kubenswrapper[4832]: I1003 00:07:43.263915 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:43 crc kubenswrapper[4832]: I1003 00:07:43.263927 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:43Z","lastTransitionTime":"2025-10-03T00:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:43 crc kubenswrapper[4832]: I1003 00:07:43.272831 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:43Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:43 crc kubenswrapper[4832]: I1003 00:07:43.287589 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zdfsz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"66a6f849-6870-4e28-8ca4-c849c22ad252\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53e3e718893a18a23dee11b28bb7285f26501527c6aa136acf2b8f7e0d7494c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f49f638397d6dffb41b5774d362b7441d3191f290149de6ecef5c38a6e8a7a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f49f638397d6dffb41b5774d362b7441d3191f290149de6ecef5c38a6e8a7a81\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d52a70ad1026d8eaaeed9dbb581eb90dbc962f036711e2f68085ac437c46b1fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d52a70ad1026d8eaaeed9dbb581eb90dbc962f036711e2f68085ac437c46b1fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ddd5a446bcc39d02bbaf8ae1b9a92b550b975c4280cea13a517076261e177bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ddd5a446bcc39d02bbaf8ae1b9a92b550b975c4280cea13a517076261e177bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f907c9873e2f8cbedebcc3116624dbaf2e6b8088d9f0ef52a5878e1515e73720\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f907c9873e2f8cbedebcc3116624dbaf2e6b8088d9f0ef52a5878e1515e73720\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d991977ea5d7140ddc09fecfc872067a05a7773c0066c94ec88c603b6f79f32e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d991977ea5d7140ddc09fecfc872067a05a7773c0066c94ec88c603b6f79f32e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7711e37ec666f0285f64eac11397ea7747d74997deaba1e9cc141218d28fbaa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7711e37ec666f0285f64eac11397ea7747d74997deaba1e9cc141218d28fbaa7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zdfsz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:43Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:43 crc kubenswrapper[4832]: I1003 00:07:43.299898 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78a6b04d90ea1fd4356f0a54594a83fd6029be216c6263654ad4916f2c3b7f5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:43Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:43 crc kubenswrapper[4832]: I1003 00:07:43.313646 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-b9c9g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4369beec-ad3f-4de9-909b-7d0f8c76aad3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://898c5e76b7f0bbad018b2aa9db1e27240ccbd9d5b66854f548aa6d664abc8799\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvpm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-b9c9g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:43Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:43 crc kubenswrapper[4832]: I1003 00:07:43.335069 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f189ec1-8169-4886-bff6-f67f7958050e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ba40dbb3c6b603f4c75b3f017b0b8953a189aac6dba4cc0e1fa5174b6353f3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2c63d70335d75dec7e19ae160dfa2a00f834d9aa3725bb4b63b94aa488fca0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3b296d00954bf341fee4ad258c6ee44adbd2950c89e73aaf5141201b61c5332\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7712a571bceb2c88adb90f9f18cdef6f7666e96341c1f9c87e6a55c82b59efd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6ee507d1127a1ca82e96f57a031cfef9adbd0b310588ce49a32f0cbcab7b3f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e493b0aab45e7a7842b48f0a13a6850b3bcd8a8f2147880ff278b92b04e9535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d07c7c33d5db217414d17af04896ed7415afa2e32a6d9686a330231dd0d0a0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d07c7c33d5db217414d17af04896ed7415afa2e32a6d9686a330231dd0d0a0f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T00:07:36Z\\\",\\\"message\\\":\\\"-crc openshift-multus/multus-additional-cni-plugins-zdfsz openshift-image-registry/node-ca-76ck7 openshift-machine-config-operator/machine-config-daemon-xzpk4 openshift-network-diagnostics/network-check-source-55646444c4-trplf openshift-network-operator/network-operator-58b4c7f79c-55gtf openshift-network-operator/iptables-alerter-4ln5h]\\\\nI1003 00:07:36.207478 6499 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI1003 00:07:36.207478 6499 lb_config.go:1031] Cluster endpoints for openshift-network-console/networking-console-plugin for network=default are: map[]\\\\nF1003 00:07:36.207491 6499 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certific\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:35Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-zc9xl_openshift-ovn-kubernetes(0f189ec1-8169-4886-bff6-f67f7958050e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae1e89f85d166170fd478f58e653b46eaa1c8335c1b124af9fde5b805864197b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0669e6b46f57beece4c83940bcffe580a4b4e2db5451a9c9ca4a9ad486fc95b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0669e6b46f57beece4c83940bcffe580a4b4e2db5451a9c9ca4a9ad486fc95b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zc9xl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:43Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:43 crc kubenswrapper[4832]: I1003 00:07:43.346382 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:43Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:43 crc kubenswrapper[4832]: I1003 00:07:43.359229 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13eeb165-19f7-4e3f-a442-1c9c570d853b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://196d3468b2a4ea6792c19f85b28df1b5df5e612428d986d2affed70158ba92f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfnk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://786805325c320e5c08542c14a9ac9790f7c2dec237737fe1aca1523d1f2d26a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfnk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xzpk4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:43Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:43 crc kubenswrapper[4832]: I1003 00:07:43.366743 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:43 crc kubenswrapper[4832]: I1003 00:07:43.366818 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:43 crc kubenswrapper[4832]: I1003 00:07:43.366843 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:43 crc kubenswrapper[4832]: I1003 00:07:43.366865 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:43 crc kubenswrapper[4832]: I1003 00:07:43.366880 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:43Z","lastTransitionTime":"2025-10-03T00:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:43 crc kubenswrapper[4832]: I1003 00:07:43.374187 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fad7ffe6-f721-4c38-8acd-2d4de59b527b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da0287ba03b221e49a8840f93c53df8a865b9216a4e93f4a0fc4c62d79227004\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edaa07dca4d8aba0a6439a43af7397b5d9ef4d5a398f5c03ac93f3f41fb80ed9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef77e4c71ea4d3dd7aa61afb0787bf88352730ae638cfefe1f9f9f033208692a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0636ac12ebbda7e4ee8bbe4f5be5454ca7a879c92c8f444466a34c387668398c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af8925d64d9d95117625cfca58e2f0c90128ce63c82370e3a505d48fce14e474\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"message\\\":\\\"file observer\\\\nW1003 00:07:06.003008 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 00:07:06.003232 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 00:07:06.007835 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-713505854/tls.crt::/tmp/serving-cert-713505854/tls.key\\\\\\\"\\\\nI1003 00:07:06.247728 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 00:07:06.253351 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 00:07:06.253377 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 00:07:06.253665 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 00:07:06.253677 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 00:07:06.261087 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 00:07:06.261596 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 00:07:06.261602 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 00:07:06.261606 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 00:07:06.261610 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 00:07:06.261613 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 00:07:06.261616 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 00:07:06.261905 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 00:07:06.264368 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc6fca5c4906594dc721c77dba62a306d840e88ed949a9a579a695bdc1f1c011\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://359b3f91c772c18a52ed1e5985275405b990187a860cf7a7f62c2f602f54c77e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://359b3f91c772c18a52ed1e5985275405b990187a860cf7a7f62c2f602f54c77e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:06:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:43Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:43 crc kubenswrapper[4832]: I1003 00:07:43.385668 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 00:07:43 crc kubenswrapper[4832]: I1003 00:07:43.385800 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 00:07:43 crc kubenswrapper[4832]: E1003 00:07:43.385827 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 00:07:43 crc kubenswrapper[4832]: I1003 00:07:43.385668 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ppthn" Oct 03 00:07:43 crc kubenswrapper[4832]: I1003 00:07:43.385876 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 00:07:43 crc kubenswrapper[4832]: E1003 00:07:43.385958 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 00:07:43 crc kubenswrapper[4832]: E1003 00:07:43.386055 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 00:07:43 crc kubenswrapper[4832]: E1003 00:07:43.386174 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ppthn" podUID="f3e796ab-f354-47c2-8f1b-1d29fe9ab059" Oct 03 00:07:43 crc kubenswrapper[4832]: I1003 00:07:43.388862 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a530701-412d-4b02-b532-de7993881992\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2b0553c8f774a096bc1446325436a1a73a24f67b3512462ce0b5029092f30c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0451961c80d1affd1cc6903d8c4ccba7de47c25255826d49ee312958f7a73b79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09e57b2ff533ebe739ef6e23dcd8d6f5ef85143046db77ed3a95ffb87f771152\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-re
sources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://703d4dc13279deda56e8154ab7238b507e78852fd477db2bbe6e3e19aa0fbd37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:06:47Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:43Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:43 crc kubenswrapper[4832]: I1003 00:07:43.470018 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:43 crc kubenswrapper[4832]: I1003 00:07:43.470068 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:43 crc kubenswrapper[4832]: I1003 00:07:43.470084 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:43 crc kubenswrapper[4832]: I1003 00:07:43.470538 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:43 crc kubenswrapper[4832]: I1003 00:07:43.470556 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:43Z","lastTransitionTime":"2025-10-03T00:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:43 crc kubenswrapper[4832]: I1003 00:07:43.573570 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:43 crc kubenswrapper[4832]: I1003 00:07:43.573623 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:43 crc kubenswrapper[4832]: I1003 00:07:43.573638 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:43 crc kubenswrapper[4832]: I1003 00:07:43.573660 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:43 crc kubenswrapper[4832]: I1003 00:07:43.573676 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:43Z","lastTransitionTime":"2025-10-03T00:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:43 crc kubenswrapper[4832]: I1003 00:07:43.676412 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:43 crc kubenswrapper[4832]: I1003 00:07:43.676476 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:43 crc kubenswrapper[4832]: I1003 00:07:43.676499 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:43 crc kubenswrapper[4832]: I1003 00:07:43.676530 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:43 crc kubenswrapper[4832]: I1003 00:07:43.676558 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:43Z","lastTransitionTime":"2025-10-03T00:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:43 crc kubenswrapper[4832]: I1003 00:07:43.779242 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:43 crc kubenswrapper[4832]: I1003 00:07:43.779281 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:43 crc kubenswrapper[4832]: I1003 00:07:43.779291 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:43 crc kubenswrapper[4832]: I1003 00:07:43.779306 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:43 crc kubenswrapper[4832]: I1003 00:07:43.779317 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:43Z","lastTransitionTime":"2025-10-03T00:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:43 crc kubenswrapper[4832]: I1003 00:07:43.881966 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:43 crc kubenswrapper[4832]: I1003 00:07:43.882003 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:43 crc kubenswrapper[4832]: I1003 00:07:43.882014 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:43 crc kubenswrapper[4832]: I1003 00:07:43.882029 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:43 crc kubenswrapper[4832]: I1003 00:07:43.882041 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:43Z","lastTransitionTime":"2025-10-03T00:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:43 crc kubenswrapper[4832]: I1003 00:07:43.984414 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:43 crc kubenswrapper[4832]: I1003 00:07:43.984464 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:43 crc kubenswrapper[4832]: I1003 00:07:43.984480 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:43 crc kubenswrapper[4832]: I1003 00:07:43.984503 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:43 crc kubenswrapper[4832]: I1003 00:07:43.984518 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:43Z","lastTransitionTime":"2025-10-03T00:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:44 crc kubenswrapper[4832]: I1003 00:07:44.087776 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:44 crc kubenswrapper[4832]: I1003 00:07:44.087824 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:44 crc kubenswrapper[4832]: I1003 00:07:44.087836 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:44 crc kubenswrapper[4832]: I1003 00:07:44.087851 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:44 crc kubenswrapper[4832]: I1003 00:07:44.087863 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:44Z","lastTransitionTime":"2025-10-03T00:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:44 crc kubenswrapper[4832]: I1003 00:07:44.190998 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:44 crc kubenswrapper[4832]: I1003 00:07:44.191071 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:44 crc kubenswrapper[4832]: I1003 00:07:44.191089 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:44 crc kubenswrapper[4832]: I1003 00:07:44.191115 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:44 crc kubenswrapper[4832]: I1003 00:07:44.191132 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:44Z","lastTransitionTime":"2025-10-03T00:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:44 crc kubenswrapper[4832]: I1003 00:07:44.294081 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:44 crc kubenswrapper[4832]: I1003 00:07:44.294148 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:44 crc kubenswrapper[4832]: I1003 00:07:44.294174 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:44 crc kubenswrapper[4832]: I1003 00:07:44.294203 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:44 crc kubenswrapper[4832]: I1003 00:07:44.294225 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:44Z","lastTransitionTime":"2025-10-03T00:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:44 crc kubenswrapper[4832]: I1003 00:07:44.396918 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:44 crc kubenswrapper[4832]: I1003 00:07:44.396978 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:44 crc kubenswrapper[4832]: I1003 00:07:44.396997 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:44 crc kubenswrapper[4832]: I1003 00:07:44.397063 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:44 crc kubenswrapper[4832]: I1003 00:07:44.397079 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:44Z","lastTransitionTime":"2025-10-03T00:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:44 crc kubenswrapper[4832]: I1003 00:07:44.500374 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:44 crc kubenswrapper[4832]: I1003 00:07:44.500442 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:44 crc kubenswrapper[4832]: I1003 00:07:44.500464 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:44 crc kubenswrapper[4832]: I1003 00:07:44.500528 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:44 crc kubenswrapper[4832]: I1003 00:07:44.500552 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:44Z","lastTransitionTime":"2025-10-03T00:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:44 crc kubenswrapper[4832]: I1003 00:07:44.602349 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:44 crc kubenswrapper[4832]: I1003 00:07:44.602415 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:44 crc kubenswrapper[4832]: I1003 00:07:44.602437 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:44 crc kubenswrapper[4832]: I1003 00:07:44.602461 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:44 crc kubenswrapper[4832]: I1003 00:07:44.602482 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:44Z","lastTransitionTime":"2025-10-03T00:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:44 crc kubenswrapper[4832]: I1003 00:07:44.704682 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:44 crc kubenswrapper[4832]: I1003 00:07:44.704729 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:44 crc kubenswrapper[4832]: I1003 00:07:44.704739 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:44 crc kubenswrapper[4832]: I1003 00:07:44.704784 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:44 crc kubenswrapper[4832]: I1003 00:07:44.704794 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:44Z","lastTransitionTime":"2025-10-03T00:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:44 crc kubenswrapper[4832]: I1003 00:07:44.806549 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:44 crc kubenswrapper[4832]: I1003 00:07:44.806597 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:44 crc kubenswrapper[4832]: I1003 00:07:44.806607 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:44 crc kubenswrapper[4832]: I1003 00:07:44.806621 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:44 crc kubenswrapper[4832]: I1003 00:07:44.806630 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:44Z","lastTransitionTime":"2025-10-03T00:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:44 crc kubenswrapper[4832]: I1003 00:07:44.908796 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:44 crc kubenswrapper[4832]: I1003 00:07:44.908863 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:44 crc kubenswrapper[4832]: I1003 00:07:44.908880 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:44 crc kubenswrapper[4832]: I1003 00:07:44.908904 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:44 crc kubenswrapper[4832]: I1003 00:07:44.908921 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:44Z","lastTransitionTime":"2025-10-03T00:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:45 crc kubenswrapper[4832]: I1003 00:07:45.011347 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:45 crc kubenswrapper[4832]: I1003 00:07:45.011382 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:45 crc kubenswrapper[4832]: I1003 00:07:45.011390 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:45 crc kubenswrapper[4832]: I1003 00:07:45.011402 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:45 crc kubenswrapper[4832]: I1003 00:07:45.011413 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:45Z","lastTransitionTime":"2025-10-03T00:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:45 crc kubenswrapper[4832]: I1003 00:07:45.113573 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:45 crc kubenswrapper[4832]: I1003 00:07:45.113614 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:45 crc kubenswrapper[4832]: I1003 00:07:45.113622 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:45 crc kubenswrapper[4832]: I1003 00:07:45.113634 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:45 crc kubenswrapper[4832]: I1003 00:07:45.113644 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:45Z","lastTransitionTime":"2025-10-03T00:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:45 crc kubenswrapper[4832]: I1003 00:07:45.215895 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:45 crc kubenswrapper[4832]: I1003 00:07:45.215932 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:45 crc kubenswrapper[4832]: I1003 00:07:45.215940 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:45 crc kubenswrapper[4832]: I1003 00:07:45.215953 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:45 crc kubenswrapper[4832]: I1003 00:07:45.215964 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:45Z","lastTransitionTime":"2025-10-03T00:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:45 crc kubenswrapper[4832]: I1003 00:07:45.317700 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:45 crc kubenswrapper[4832]: I1003 00:07:45.317744 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:45 crc kubenswrapper[4832]: I1003 00:07:45.317777 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:45 crc kubenswrapper[4832]: I1003 00:07:45.317794 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:45 crc kubenswrapper[4832]: I1003 00:07:45.317805 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:45Z","lastTransitionTime":"2025-10-03T00:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:45 crc kubenswrapper[4832]: I1003 00:07:45.385774 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 00:07:45 crc kubenswrapper[4832]: I1003 00:07:45.385780 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ppthn" Oct 03 00:07:45 crc kubenswrapper[4832]: E1003 00:07:45.385896 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 00:07:45 crc kubenswrapper[4832]: I1003 00:07:45.385950 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 00:07:45 crc kubenswrapper[4832]: I1003 00:07:45.385975 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 00:07:45 crc kubenswrapper[4832]: E1003 00:07:45.386032 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ppthn" podUID="f3e796ab-f354-47c2-8f1b-1d29fe9ab059" Oct 03 00:07:45 crc kubenswrapper[4832]: E1003 00:07:45.386101 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 00:07:45 crc kubenswrapper[4832]: E1003 00:07:45.386176 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 00:07:45 crc kubenswrapper[4832]: I1003 00:07:45.420001 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:45 crc kubenswrapper[4832]: I1003 00:07:45.420049 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:45 crc kubenswrapper[4832]: I1003 00:07:45.420057 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:45 crc kubenswrapper[4832]: I1003 00:07:45.420068 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:45 crc kubenswrapper[4832]: I1003 00:07:45.420076 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:45Z","lastTransitionTime":"2025-10-03T00:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:45 crc kubenswrapper[4832]: I1003 00:07:45.522700 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:45 crc kubenswrapper[4832]: I1003 00:07:45.523062 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:45 crc kubenswrapper[4832]: I1003 00:07:45.523256 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:45 crc kubenswrapper[4832]: I1003 00:07:45.523483 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:45 crc kubenswrapper[4832]: I1003 00:07:45.523694 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:45Z","lastTransitionTime":"2025-10-03T00:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:45 crc kubenswrapper[4832]: I1003 00:07:45.625711 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:45 crc kubenswrapper[4832]: I1003 00:07:45.625940 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:45 crc kubenswrapper[4832]: I1003 00:07:45.626093 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:45 crc kubenswrapper[4832]: I1003 00:07:45.626199 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:45 crc kubenswrapper[4832]: I1003 00:07:45.626278 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:45Z","lastTransitionTime":"2025-10-03T00:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:45 crc kubenswrapper[4832]: I1003 00:07:45.728183 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:45 crc kubenswrapper[4832]: I1003 00:07:45.728213 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:45 crc kubenswrapper[4832]: I1003 00:07:45.728223 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:45 crc kubenswrapper[4832]: I1003 00:07:45.728239 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:45 crc kubenswrapper[4832]: I1003 00:07:45.728249 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:45Z","lastTransitionTime":"2025-10-03T00:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:45 crc kubenswrapper[4832]: I1003 00:07:45.831274 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:45 crc kubenswrapper[4832]: I1003 00:07:45.831324 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:45 crc kubenswrapper[4832]: I1003 00:07:45.831353 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:45 crc kubenswrapper[4832]: I1003 00:07:45.831371 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:45 crc kubenswrapper[4832]: I1003 00:07:45.831382 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:45Z","lastTransitionTime":"2025-10-03T00:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:45 crc kubenswrapper[4832]: I1003 00:07:45.934814 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:45 crc kubenswrapper[4832]: I1003 00:07:45.934868 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:45 crc kubenswrapper[4832]: I1003 00:07:45.934877 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:45 crc kubenswrapper[4832]: I1003 00:07:45.934892 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:45 crc kubenswrapper[4832]: I1003 00:07:45.934904 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:45Z","lastTransitionTime":"2025-10-03T00:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:46 crc kubenswrapper[4832]: I1003 00:07:46.038089 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:46 crc kubenswrapper[4832]: I1003 00:07:46.038449 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:46 crc kubenswrapper[4832]: I1003 00:07:46.038606 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:46 crc kubenswrapper[4832]: I1003 00:07:46.038814 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:46 crc kubenswrapper[4832]: I1003 00:07:46.039002 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:46Z","lastTransitionTime":"2025-10-03T00:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:46 crc kubenswrapper[4832]: I1003 00:07:46.141990 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:46 crc kubenswrapper[4832]: I1003 00:07:46.142054 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:46 crc kubenswrapper[4832]: I1003 00:07:46.142077 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:46 crc kubenswrapper[4832]: I1003 00:07:46.142107 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:46 crc kubenswrapper[4832]: I1003 00:07:46.142129 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:46Z","lastTransitionTime":"2025-10-03T00:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:46 crc kubenswrapper[4832]: I1003 00:07:46.245145 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:46 crc kubenswrapper[4832]: I1003 00:07:46.245202 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:46 crc kubenswrapper[4832]: I1003 00:07:46.245210 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:46 crc kubenswrapper[4832]: I1003 00:07:46.245226 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:46 crc kubenswrapper[4832]: I1003 00:07:46.245235 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:46Z","lastTransitionTime":"2025-10-03T00:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:46 crc kubenswrapper[4832]: I1003 00:07:46.347889 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:46 crc kubenswrapper[4832]: I1003 00:07:46.347922 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:46 crc kubenswrapper[4832]: I1003 00:07:46.347932 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:46 crc kubenswrapper[4832]: I1003 00:07:46.347944 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:46 crc kubenswrapper[4832]: I1003 00:07:46.347952 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:46Z","lastTransitionTime":"2025-10-03T00:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:46 crc kubenswrapper[4832]: I1003 00:07:46.450256 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:46 crc kubenswrapper[4832]: I1003 00:07:46.450313 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:46 crc kubenswrapper[4832]: I1003 00:07:46.450330 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:46 crc kubenswrapper[4832]: I1003 00:07:46.450353 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:46 crc kubenswrapper[4832]: I1003 00:07:46.450369 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:46Z","lastTransitionTime":"2025-10-03T00:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:46 crc kubenswrapper[4832]: I1003 00:07:46.553033 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:46 crc kubenswrapper[4832]: I1003 00:07:46.553079 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:46 crc kubenswrapper[4832]: I1003 00:07:46.553093 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:46 crc kubenswrapper[4832]: I1003 00:07:46.553114 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:46 crc kubenswrapper[4832]: I1003 00:07:46.553129 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:46Z","lastTransitionTime":"2025-10-03T00:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:46 crc kubenswrapper[4832]: I1003 00:07:46.655882 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:46 crc kubenswrapper[4832]: I1003 00:07:46.655929 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:46 crc kubenswrapper[4832]: I1003 00:07:46.655939 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:46 crc kubenswrapper[4832]: I1003 00:07:46.655955 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:46 crc kubenswrapper[4832]: I1003 00:07:46.655972 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:46Z","lastTransitionTime":"2025-10-03T00:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:46 crc kubenswrapper[4832]: I1003 00:07:46.758914 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:46 crc kubenswrapper[4832]: I1003 00:07:46.759259 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:46 crc kubenswrapper[4832]: I1003 00:07:46.759326 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:46 crc kubenswrapper[4832]: I1003 00:07:46.759394 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:46 crc kubenswrapper[4832]: I1003 00:07:46.759473 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:46Z","lastTransitionTime":"2025-10-03T00:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:46 crc kubenswrapper[4832]: I1003 00:07:46.862481 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:46 crc kubenswrapper[4832]: I1003 00:07:46.862524 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:46 crc kubenswrapper[4832]: I1003 00:07:46.862537 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:46 crc kubenswrapper[4832]: I1003 00:07:46.862553 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:46 crc kubenswrapper[4832]: I1003 00:07:46.862565 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:46Z","lastTransitionTime":"2025-10-03T00:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:46 crc kubenswrapper[4832]: I1003 00:07:46.964637 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:46 crc kubenswrapper[4832]: I1003 00:07:46.964974 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:46 crc kubenswrapper[4832]: I1003 00:07:46.965061 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:46 crc kubenswrapper[4832]: I1003 00:07:46.965147 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:46 crc kubenswrapper[4832]: I1003 00:07:46.965205 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:46Z","lastTransitionTime":"2025-10-03T00:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:47 crc kubenswrapper[4832]: I1003 00:07:47.068135 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:47 crc kubenswrapper[4832]: I1003 00:07:47.068210 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:47 crc kubenswrapper[4832]: I1003 00:07:47.068234 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:47 crc kubenswrapper[4832]: I1003 00:07:47.068264 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:47 crc kubenswrapper[4832]: I1003 00:07:47.068287 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:47Z","lastTransitionTime":"2025-10-03T00:07:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:47 crc kubenswrapper[4832]: I1003 00:07:47.170814 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:47 crc kubenswrapper[4832]: I1003 00:07:47.170861 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:47 crc kubenswrapper[4832]: I1003 00:07:47.170910 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:47 crc kubenswrapper[4832]: I1003 00:07:47.171053 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:47 crc kubenswrapper[4832]: I1003 00:07:47.171088 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:47Z","lastTransitionTime":"2025-10-03T00:07:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:47 crc kubenswrapper[4832]: I1003 00:07:47.273297 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:47 crc kubenswrapper[4832]: I1003 00:07:47.273564 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:47 crc kubenswrapper[4832]: I1003 00:07:47.273630 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:47 crc kubenswrapper[4832]: I1003 00:07:47.273709 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:47 crc kubenswrapper[4832]: I1003 00:07:47.273824 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:47Z","lastTransitionTime":"2025-10-03T00:07:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:47 crc kubenswrapper[4832]: I1003 00:07:47.375816 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:47 crc kubenswrapper[4832]: I1003 00:07:47.375851 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:47 crc kubenswrapper[4832]: I1003 00:07:47.375863 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:47 crc kubenswrapper[4832]: I1003 00:07:47.375877 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:47 crc kubenswrapper[4832]: I1003 00:07:47.375888 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:47Z","lastTransitionTime":"2025-10-03T00:07:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:47 crc kubenswrapper[4832]: I1003 00:07:47.385412 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 00:07:47 crc kubenswrapper[4832]: I1003 00:07:47.385505 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 00:07:47 crc kubenswrapper[4832]: I1003 00:07:47.385588 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ppthn" Oct 03 00:07:47 crc kubenswrapper[4832]: I1003 00:07:47.385653 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 00:07:47 crc kubenswrapper[4832]: E1003 00:07:47.385668 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 00:07:47 crc kubenswrapper[4832]: E1003 00:07:47.385915 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 00:07:47 crc kubenswrapper[4832]: E1003 00:07:47.385899 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ppthn" podUID="f3e796ab-f354-47c2-8f1b-1d29fe9ab059" Oct 03 00:07:47 crc kubenswrapper[4832]: E1003 00:07:47.386035 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 00:07:47 crc kubenswrapper[4832]: I1003 00:07:47.401150 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a530701-412d-4b02-b532-de7993881992\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2b0553c8f774a096bc1446325436a1a73a24f67b3512462ce0b5029092f30c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0451961c80d1affd1cc6903d8c4ccba7de47c25255826d49ee312958f7a73b79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09e57b2ff533ebe739ef6e23dcd8d6f5ef85143046db77ed3a95ffb87f771152\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/k
ubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://703d4dc13279deda56e8154ab7238b507e78852fd477db2bbe6e3e19aa0fbd37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:06:47Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:47Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:47 crc kubenswrapper[4832]: I1003 00:07:47.418048 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"75a3f03f-3483-4611-9098-1bc8f776bf72\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2179d1da37c6a7dc24550fceca8e5f629ced4a0bfc34d8e0439be45b1a1128b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a5815e928eca00e1782c57d1b18e6fb2f7db287f404ea3c83323efe2db13752\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c798af55ddd694a06f5faa8f350941ad89970fa629efb67ad4f0ed08608192c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://990d28547daf6496b88f4cbe2716fce600053a9cbff1effd1a7bcf555de9f990\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://990d28547daf6496b88f4cbe2716fce600053a9cbff1effd1a7bcf555de9f990\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:06:47Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:47Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:47 crc kubenswrapper[4832]: I1003 00:07:47.431860 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:47Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:47 crc kubenswrapper[4832]: I1003 00:07:47.447873 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13eeb165-19f7-4e3f-a442-1c9c570d853b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://196d3468b2a4ea6792c19f85b28df1b5df5e612428d986d2affed70158ba92f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfnk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://786805325c320e5c08542c14a9ac9790f7c2dec237737fe1aca1523d1f2d26a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfnk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xzpk4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:47Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:47 crc kubenswrapper[4832]: I1003 00:07:47.465639 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fad7ffe6-f721-4c38-8acd-2d4de59b527b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da0287ba03b221e49a8840f93c53df8a865b9216a4e93f4a0fc4c62d79227004\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edaa07dca4d8aba0a6439a43af7397b5d9ef4d5a398f5c03ac93f3f41fb80ed9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6
355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef77e4c71ea4d3dd7aa61afb0787bf88352730ae638cfefe1f9f9f033208692a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0636ac12ebbda7e4ee8bbe4f5be5454ca7a879c92c8f444466a34c387668398c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af8925d64d9d95117625cfca58e2f0c90128ce63c82370e3a505d48fce14e474\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"message\\\":\\\"file observer\\\\nW1003 00:07:06.003008 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 00:07:06.003232 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 00:07:06.007835 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-713505854/tls.crt::/tmp/serving-cert-713505854/tls.key\\\\\\\"\\\\nI1003 00:07:06.247728 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 00:07:06.253351 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 00:07:06.253377 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 00:07:06.253665 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 00:07:06.253677 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 00:07:06.261087 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 00:07:06.261596 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 00:07:06.261602 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 00:07:06.261606 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 00:07:06.261610 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 00:07:06.261613 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 00:07:06.261616 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 00:07:06.261905 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 00:07:06.264368 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc6fca5c4906594dc721c77dba62a306d840e88ed949a9a579a695bdc1f1c011\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://359b3f91c772c18a52ed1e5985275405b990187a860cf7a7f62c2f602f54c77e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://359b3f91c772c18a52ed1e5985275405b990187a860cf7a7f62c2f602f54c77e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:06:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:47Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:47 crc kubenswrapper[4832]: I1003 00:07:47.479329 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:47 crc kubenswrapper[4832]: I1003 00:07:47.479377 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:47 crc kubenswrapper[4832]: I1003 00:07:47.479390 4832 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:47 crc kubenswrapper[4832]: I1003 00:07:47.479407 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:47 crc kubenswrapper[4832]: I1003 00:07:47.479418 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:47Z","lastTransitionTime":"2025-10-03T00:07:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:47 crc kubenswrapper[4832]: I1003 00:07:47.479919 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4175afbbdf7905f2c69f608e1cd85fdfae2818582d88e9a6d46c20ba0cede788\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:47Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:47 crc kubenswrapper[4832]: I1003 00:07:47.494988 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6753f7eb0f3ff7e4c94125513be77026242079d658b7139e86701a2b04ba5d1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://823db4c8596fa85759a3d818b0282b8b8b2c4c2e948160accbd306ef09e38060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:47Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:47 crc kubenswrapper[4832]: I1003 00:07:47.506116 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rwqt7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96bf1ee1-9249-4db9-83da-17eb15aa5f9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://778c9afa5747e424ff72688b9d90dd21e3faec1f409c2b9683b5a16a37685369\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm92t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rwqt7\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:47Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:47 crc kubenswrapper[4832]: I1003 00:07:47.514170 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-76ck7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b962b7f-e442-4e25-97f4-21f16b9f1324\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e5df7a035e3c381c541f08eec6fffe846e7002a534b4c6ae1316a15cf74e469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ksm2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-76ck7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:47Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:47 crc kubenswrapper[4832]: I1003 00:07:47.524162 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4j5bh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7262a85e-7e44-4422-b1d2-5d045503e06a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afd2b89ee54380c73b04bde347dce4d74a847e4ba94fc7aeabd5ac24e4dfd4b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vwhr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f03e95e621cb9e7d0c04b426440cf746eeccf68892215e1ed9e963935b9fdba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vwhr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4j5bh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:47Z is after 2025-08-24T17:21:41Z" Oct 03 
00:07:47 crc kubenswrapper[4832]: I1003 00:07:47.532599 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ppthn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3e796ab-f354-47c2-8f1b-1d29fe9ab059\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:25Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ppthn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:47Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:47 crc kubenswrapper[4832]: I1003 00:07:47.549734 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"31c3e0e9-c2a7-4c60-b523-9972df4adad5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1688f7425bdf5630147752e6d2c0dc58351b86d087c39bc6c4d73ffbf11949b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70f4161badca9a8087ab95b77ede5bc72d0c44b3f7f0223f2073354e76d6070d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5eb1fa15d84200378239e8a064cb83aa401750e92073f9d29d94858e65add776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://342b151ec083dc566428af857e4013b70d34fbf
70632147d13f5e251ef3460c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f39a6c9881b2f1d620a85a3c6887d82b81d18ecf599c66b967d5580689537314\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a180babeb6469b2158474f74047864c74117a5c272be9437712bc1729563108b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a180babeb6469b2158474f74047864c74117a5c272be9437712bc1729563108b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b12fcbe54af7073178320f8a130e78196b2d1d3a055c4d8410e4cc1da2d4686e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b12fcbe54af7073178320f8a130e78196b2d1d3a055c4d8410e4cc1da2d4686e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f5e7853d0c8e96873c687c75d7226d8c11ed0ab5f4f2e1e975423de06b7c8c10\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5e7853d0c8e96873c687c75d7226d8c11ed0ab5f4f2e1e975423de06b7c8c10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:06:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:47Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:47 crc kubenswrapper[4832]: I1003 00:07:47.560552 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:47Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:47 crc kubenswrapper[4832]: I1003 00:07:47.571514 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:47Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:47 crc kubenswrapper[4832]: I1003 00:07:47.581307 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:47 crc kubenswrapper[4832]: I1003 00:07:47.581340 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:47 crc kubenswrapper[4832]: I1003 00:07:47.581380 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:47 crc kubenswrapper[4832]: I1003 00:07:47.581397 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:47 crc kubenswrapper[4832]: I1003 00:07:47.581407 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:47Z","lastTransitionTime":"2025-10-03T00:07:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:47 crc kubenswrapper[4832]: I1003 00:07:47.585318 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zdfsz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66a6f849-6870-4e28-8ca4-c849c22ad252\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53e3e718893a18a23dee11b28bb7285f26501527c6aa136acf2b8f7e0d7494c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f49f638397d6dffb41b5774d362b7441d3191f290149de6ecef5c38a6e8a7a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f49f638397d6dffb41b5774d362b7441d3191f290149de6ecef5c38a6e8a7a81\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d52a70ad1026d8eaaeed9dbb581eb90dbc962f036711e2f68085ac437c46b1fd\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d52a70ad1026d8eaaeed9dbb581eb90dbc962f036711e2f68085ac437c46b1fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ddd5a446bcc39d02bbaf8ae1b9a92b550b975c4280cea13a517076261e177bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ddd5a446bcc39d02bbaf8ae1b9a92b550b975c4280cea13a517076261e177bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f907c9873e2f8cbedebcc3116624dbaf2e6b8088d9f0ef52a5878e1515e73720\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f907c9873e2f8cbedebcc3116624dbaf2e6b8088d9f0ef52a5878e1515e73720\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d991977ea5d7140ddc09fecfc872067a05a7773c0066c94ec88c603b6f79f32e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d991977ea5d7140ddc09fecfc872067a05a7773c0066c94ec88c603b6f79f32e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7711e37ec666f0285f64eac11397ea7747d74997deaba1e9cc141218d28fbaa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7711e37ec666f0285f64eac11397ea7747d74997deaba1e9cc141218d28fbaa7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zdfsz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:47Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:47 crc kubenswrapper[4832]: I1003 00:07:47.600981 4832 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f189ec1-8169-4886-bff6-f67f7958050e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ba40dbb3c6b603f4c75b3f017b0b8953a189aac6dba4cc0e1fa5174b6353f3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2c63d70335d75dec7e19ae160dfa2a00f834d9aa3725bb4b63b94aa488fca0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3b296d00954bf341fee4ad258c6ee44adbd2950c89e73aaf5141201b61c5332\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36c
dd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7712a571bceb2c88adb90f9f18cdef6f7666e96341c1f9c87e6a55c82b59efd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6ee507d1127a1ca82e96f57a031cfef9adbd0b310588ce49a32f0cbcab7b3f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e493b0aab45e7a7842b48f0a13a6850b3bcd8a8f2147880ff278b92b04e9535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-con
troller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d07c7c33d5db217414d17af04896ed7415afa2e32a6d9686a330231dd0d0a0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d07c7c33d5db217414d17af04896ed7415afa2e32a6d9686a330231dd0d0a0f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T00:07:36Z\\\",\\\"message\\\":\\\"-crc openshift-multus/multus-additional-cni-plugins-zdfsz openshift-image-registry/node-ca-76ck7 openshift-machine-config-operator/machine-config-daemon-xzpk4 openshift-network-diagnostics/network-check-source-55646444c4-trplf openshift-network-operator/network-operator-58b4c7f79c-55gtf openshift-network-operator/iptables-alerter-4ln5h]\\\\nI1003 00:07:36.207478 6499 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI1003 00:07:36.207478 6499 lb_config.go:1031] Cluster endpoints for openshift-network-console/networking-console-plugin for network=default are: map[]\\\\nF1003 00:07:36.207491 6499 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certific\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:35Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-zc9xl_openshift-ovn-kubernetes(0f189ec1-8169-4886-bff6-f67f7958050e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae1e89f85d166170fd478f58e653b46eaa1c8335c1b124af9fde5b805864197b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0669e6b46f57beece4c83940bcffe580a4b4e2db5451a9c9ca4a9ad486fc95b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0669e6b46f57beece4c83940bcffe580a4b4e2db5451a9c9ca4a9ad486fc95b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zc9xl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:47Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:47 crc kubenswrapper[4832]: I1003 00:07:47.610295 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78a6b04d90ea1fd4356f0a54594a83fd6029be216c6263654ad4916f2c3b7f5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed 
to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:47Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:47 crc kubenswrapper[4832]: I1003 00:07:47.619685 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-b9c9g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4369beec-ad3f-4de9-909b-7d0f8c76aad3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://898c5e76b7f0bbad018b2aa9db1e27240ccbd9d5b66854f548aa6d664abc8799\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvpm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-b9c9g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:47Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:47 crc kubenswrapper[4832]: I1003 00:07:47.683692 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:47 crc kubenswrapper[4832]: I1003 00:07:47.683736 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:47 crc kubenswrapper[4832]: I1003 00:07:47.683747 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:47 crc kubenswrapper[4832]: I1003 00:07:47.683782 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" 
Oct 03 00:07:47 crc kubenswrapper[4832]: I1003 00:07:47.683794 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:47Z","lastTransitionTime":"2025-10-03T00:07:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:47 crc kubenswrapper[4832]: I1003 00:07:47.785793 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:47 crc kubenswrapper[4832]: I1003 00:07:47.785841 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:47 crc kubenswrapper[4832]: I1003 00:07:47.785850 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:47 crc kubenswrapper[4832]: I1003 00:07:47.785865 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:47 crc kubenswrapper[4832]: I1003 00:07:47.785874 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:47Z","lastTransitionTime":"2025-10-03T00:07:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:47 crc kubenswrapper[4832]: I1003 00:07:47.888446 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:47 crc kubenswrapper[4832]: I1003 00:07:47.888660 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:47 crc kubenswrapper[4832]: I1003 00:07:47.888668 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:47 crc kubenswrapper[4832]: I1003 00:07:47.888681 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:47 crc kubenswrapper[4832]: I1003 00:07:47.888691 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:47Z","lastTransitionTime":"2025-10-03T00:07:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:47 crc kubenswrapper[4832]: I1003 00:07:47.991031 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:47 crc kubenswrapper[4832]: I1003 00:07:47.991082 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:47 crc kubenswrapper[4832]: I1003 00:07:47.991091 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:47 crc kubenswrapper[4832]: I1003 00:07:47.991106 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:47 crc kubenswrapper[4832]: I1003 00:07:47.991117 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:47Z","lastTransitionTime":"2025-10-03T00:07:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:48 crc kubenswrapper[4832]: I1003 00:07:48.093321 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:48 crc kubenswrapper[4832]: I1003 00:07:48.093370 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:48 crc kubenswrapper[4832]: I1003 00:07:48.093378 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:48 crc kubenswrapper[4832]: I1003 00:07:48.093390 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:48 crc kubenswrapper[4832]: I1003 00:07:48.093398 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:48Z","lastTransitionTime":"2025-10-03T00:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:48 crc kubenswrapper[4832]: I1003 00:07:48.195836 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:48 crc kubenswrapper[4832]: I1003 00:07:48.195896 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:48 crc kubenswrapper[4832]: I1003 00:07:48.195917 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:48 crc kubenswrapper[4832]: I1003 00:07:48.195935 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:48 crc kubenswrapper[4832]: I1003 00:07:48.195944 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:48Z","lastTransitionTime":"2025-10-03T00:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:48 crc kubenswrapper[4832]: I1003 00:07:48.298060 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:48 crc kubenswrapper[4832]: I1003 00:07:48.298108 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:48 crc kubenswrapper[4832]: I1003 00:07:48.298126 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:48 crc kubenswrapper[4832]: I1003 00:07:48.298141 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:48 crc kubenswrapper[4832]: I1003 00:07:48.298151 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:48Z","lastTransitionTime":"2025-10-03T00:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:48 crc kubenswrapper[4832]: I1003 00:07:48.386626 4832 scope.go:117] "RemoveContainer" containerID="4d07c7c33d5db217414d17af04896ed7415afa2e32a6d9686a330231dd0d0a0f" Oct 03 00:07:48 crc kubenswrapper[4832]: E1003 00:07:48.386795 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-zc9xl_openshift-ovn-kubernetes(0f189ec1-8169-4886-bff6-f67f7958050e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" podUID="0f189ec1-8169-4886-bff6-f67f7958050e" Oct 03 00:07:48 crc kubenswrapper[4832]: I1003 00:07:48.399657 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:48 crc kubenswrapper[4832]: I1003 00:07:48.399691 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:48 crc kubenswrapper[4832]: I1003 00:07:48.399701 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:48 crc kubenswrapper[4832]: I1003 00:07:48.399713 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:48 crc kubenswrapper[4832]: I1003 00:07:48.399721 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:48Z","lastTransitionTime":"2025-10-03T00:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:48 crc kubenswrapper[4832]: I1003 00:07:48.502054 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:48 crc kubenswrapper[4832]: I1003 00:07:48.502103 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:48 crc kubenswrapper[4832]: I1003 00:07:48.502112 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:48 crc kubenswrapper[4832]: I1003 00:07:48.502127 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:48 crc kubenswrapper[4832]: I1003 00:07:48.502136 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:48Z","lastTransitionTime":"2025-10-03T00:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:48 crc kubenswrapper[4832]: I1003 00:07:48.604250 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:48 crc kubenswrapper[4832]: I1003 00:07:48.604290 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:48 crc kubenswrapper[4832]: I1003 00:07:48.604299 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:48 crc kubenswrapper[4832]: I1003 00:07:48.604312 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:48 crc kubenswrapper[4832]: I1003 00:07:48.604321 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:48Z","lastTransitionTime":"2025-10-03T00:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:48 crc kubenswrapper[4832]: I1003 00:07:48.706869 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:48 crc kubenswrapper[4832]: I1003 00:07:48.706915 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:48 crc kubenswrapper[4832]: I1003 00:07:48.706927 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:48 crc kubenswrapper[4832]: I1003 00:07:48.706946 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:48 crc kubenswrapper[4832]: I1003 00:07:48.706964 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:48Z","lastTransitionTime":"2025-10-03T00:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:48 crc kubenswrapper[4832]: I1003 00:07:48.809153 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:48 crc kubenswrapper[4832]: I1003 00:07:48.809200 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:48 crc kubenswrapper[4832]: I1003 00:07:48.809211 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:48 crc kubenswrapper[4832]: I1003 00:07:48.809228 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:48 crc kubenswrapper[4832]: I1003 00:07:48.809239 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:48Z","lastTransitionTime":"2025-10-03T00:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:48 crc kubenswrapper[4832]: I1003 00:07:48.912376 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:48 crc kubenswrapper[4832]: I1003 00:07:48.912430 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:48 crc kubenswrapper[4832]: I1003 00:07:48.912445 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:48 crc kubenswrapper[4832]: I1003 00:07:48.912471 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:48 crc kubenswrapper[4832]: I1003 00:07:48.912485 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:48Z","lastTransitionTime":"2025-10-03T00:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:49 crc kubenswrapper[4832]: I1003 00:07:49.015557 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:49 crc kubenswrapper[4832]: I1003 00:07:49.015632 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:49 crc kubenswrapper[4832]: I1003 00:07:49.015656 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:49 crc kubenswrapper[4832]: I1003 00:07:49.015684 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:49 crc kubenswrapper[4832]: I1003 00:07:49.015706 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:49Z","lastTransitionTime":"2025-10-03T00:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:49 crc kubenswrapper[4832]: I1003 00:07:49.120643 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:49 crc kubenswrapper[4832]: I1003 00:07:49.120727 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:49 crc kubenswrapper[4832]: I1003 00:07:49.120784 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:49 crc kubenswrapper[4832]: I1003 00:07:49.120838 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:49 crc kubenswrapper[4832]: I1003 00:07:49.120864 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:49Z","lastTransitionTime":"2025-10-03T00:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:49 crc kubenswrapper[4832]: I1003 00:07:49.223935 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:49 crc kubenswrapper[4832]: I1003 00:07:49.223992 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:49 crc kubenswrapper[4832]: I1003 00:07:49.224008 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:49 crc kubenswrapper[4832]: I1003 00:07:49.224030 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:49 crc kubenswrapper[4832]: I1003 00:07:49.224047 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:49Z","lastTransitionTime":"2025-10-03T00:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:49 crc kubenswrapper[4832]: I1003 00:07:49.326869 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:49 crc kubenswrapper[4832]: I1003 00:07:49.326907 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:49 crc kubenswrapper[4832]: I1003 00:07:49.326915 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:49 crc kubenswrapper[4832]: I1003 00:07:49.326930 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:49 crc kubenswrapper[4832]: I1003 00:07:49.326939 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:49Z","lastTransitionTime":"2025-10-03T00:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:49 crc kubenswrapper[4832]: I1003 00:07:49.386472 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 00:07:49 crc kubenswrapper[4832]: I1003 00:07:49.386550 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 00:07:49 crc kubenswrapper[4832]: I1003 00:07:49.386511 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ppthn" Oct 03 00:07:49 crc kubenswrapper[4832]: I1003 00:07:49.386747 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 00:07:49 crc kubenswrapper[4832]: E1003 00:07:49.386702 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 00:07:49 crc kubenswrapper[4832]: E1003 00:07:49.386913 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 00:07:49 crc kubenswrapper[4832]: E1003 00:07:49.387027 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 00:07:49 crc kubenswrapper[4832]: E1003 00:07:49.387161 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ppthn" podUID="f3e796ab-f354-47c2-8f1b-1d29fe9ab059" Oct 03 00:07:49 crc kubenswrapper[4832]: I1003 00:07:49.430075 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:49 crc kubenswrapper[4832]: I1003 00:07:49.430125 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:49 crc kubenswrapper[4832]: I1003 00:07:49.430140 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:49 crc kubenswrapper[4832]: I1003 00:07:49.430158 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:49 crc kubenswrapper[4832]: I1003 00:07:49.430170 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:49Z","lastTransitionTime":"2025-10-03T00:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:49 crc kubenswrapper[4832]: I1003 00:07:49.532225 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:49 crc kubenswrapper[4832]: I1003 00:07:49.532266 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:49 crc kubenswrapper[4832]: I1003 00:07:49.532278 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:49 crc kubenswrapper[4832]: I1003 00:07:49.532295 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:49 crc kubenswrapper[4832]: I1003 00:07:49.532307 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:49Z","lastTransitionTime":"2025-10-03T00:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:49 crc kubenswrapper[4832]: I1003 00:07:49.634164 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:49 crc kubenswrapper[4832]: I1003 00:07:49.634213 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:49 crc kubenswrapper[4832]: I1003 00:07:49.634225 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:49 crc kubenswrapper[4832]: I1003 00:07:49.634242 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:49 crc kubenswrapper[4832]: I1003 00:07:49.634252 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:49Z","lastTransitionTime":"2025-10-03T00:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:49 crc kubenswrapper[4832]: I1003 00:07:49.736889 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:49 crc kubenswrapper[4832]: I1003 00:07:49.736969 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:49 crc kubenswrapper[4832]: I1003 00:07:49.736991 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:49 crc kubenswrapper[4832]: I1003 00:07:49.737020 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:49 crc kubenswrapper[4832]: I1003 00:07:49.737042 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:49Z","lastTransitionTime":"2025-10-03T00:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:49 crc kubenswrapper[4832]: I1003 00:07:49.839504 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:49 crc kubenswrapper[4832]: I1003 00:07:49.839585 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:49 crc kubenswrapper[4832]: I1003 00:07:49.839602 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:49 crc kubenswrapper[4832]: I1003 00:07:49.839620 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:49 crc kubenswrapper[4832]: I1003 00:07:49.839632 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:49Z","lastTransitionTime":"2025-10-03T00:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:49 crc kubenswrapper[4832]: I1003 00:07:49.942086 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:49 crc kubenswrapper[4832]: I1003 00:07:49.942116 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:49 crc kubenswrapper[4832]: I1003 00:07:49.942124 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:49 crc kubenswrapper[4832]: I1003 00:07:49.942136 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:49 crc kubenswrapper[4832]: I1003 00:07:49.942145 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:49Z","lastTransitionTime":"2025-10-03T00:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:50 crc kubenswrapper[4832]: I1003 00:07:50.045523 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:50 crc kubenswrapper[4832]: I1003 00:07:50.045653 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:50 crc kubenswrapper[4832]: I1003 00:07:50.045677 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:50 crc kubenswrapper[4832]: I1003 00:07:50.045704 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:50 crc kubenswrapper[4832]: I1003 00:07:50.045724 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:50Z","lastTransitionTime":"2025-10-03T00:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:50 crc kubenswrapper[4832]: I1003 00:07:50.148469 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:50 crc kubenswrapper[4832]: I1003 00:07:50.148511 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:50 crc kubenswrapper[4832]: I1003 00:07:50.148521 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:50 crc kubenswrapper[4832]: I1003 00:07:50.148535 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:50 crc kubenswrapper[4832]: I1003 00:07:50.148544 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:50Z","lastTransitionTime":"2025-10-03T00:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:50 crc kubenswrapper[4832]: I1003 00:07:50.251408 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:50 crc kubenswrapper[4832]: I1003 00:07:50.251468 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:50 crc kubenswrapper[4832]: I1003 00:07:50.251485 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:50 crc kubenswrapper[4832]: I1003 00:07:50.251509 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:50 crc kubenswrapper[4832]: I1003 00:07:50.251527 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:50Z","lastTransitionTime":"2025-10-03T00:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:50 crc kubenswrapper[4832]: I1003 00:07:50.354679 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:50 crc kubenswrapper[4832]: I1003 00:07:50.354740 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:50 crc kubenswrapper[4832]: I1003 00:07:50.354824 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:50 crc kubenswrapper[4832]: I1003 00:07:50.354876 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:50 crc kubenswrapper[4832]: I1003 00:07:50.354899 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:50Z","lastTransitionTime":"2025-10-03T00:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:50 crc kubenswrapper[4832]: I1003 00:07:50.457382 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:50 crc kubenswrapper[4832]: I1003 00:07:50.457414 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:50 crc kubenswrapper[4832]: I1003 00:07:50.457423 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:50 crc kubenswrapper[4832]: I1003 00:07:50.457438 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:50 crc kubenswrapper[4832]: I1003 00:07:50.457447 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:50Z","lastTransitionTime":"2025-10-03T00:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:50 crc kubenswrapper[4832]: I1003 00:07:50.559851 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:50 crc kubenswrapper[4832]: I1003 00:07:50.559885 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:50 crc kubenswrapper[4832]: I1003 00:07:50.559895 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:50 crc kubenswrapper[4832]: I1003 00:07:50.559910 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:50 crc kubenswrapper[4832]: I1003 00:07:50.559920 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:50Z","lastTransitionTime":"2025-10-03T00:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:50 crc kubenswrapper[4832]: I1003 00:07:50.663043 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:50 crc kubenswrapper[4832]: I1003 00:07:50.663078 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:50 crc kubenswrapper[4832]: I1003 00:07:50.663088 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:50 crc kubenswrapper[4832]: I1003 00:07:50.663103 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:50 crc kubenswrapper[4832]: I1003 00:07:50.663114 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:50Z","lastTransitionTime":"2025-10-03T00:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:50 crc kubenswrapper[4832]: I1003 00:07:50.765509 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:50 crc kubenswrapper[4832]: I1003 00:07:50.765544 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:50 crc kubenswrapper[4832]: I1003 00:07:50.765556 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:50 crc kubenswrapper[4832]: I1003 00:07:50.765571 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:50 crc kubenswrapper[4832]: I1003 00:07:50.765582 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:50Z","lastTransitionTime":"2025-10-03T00:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:50 crc kubenswrapper[4832]: I1003 00:07:50.867878 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:50 crc kubenswrapper[4832]: I1003 00:07:50.867951 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:50 crc kubenswrapper[4832]: I1003 00:07:50.867970 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:50 crc kubenswrapper[4832]: I1003 00:07:50.868008 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:50 crc kubenswrapper[4832]: I1003 00:07:50.868029 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:50Z","lastTransitionTime":"2025-10-03T00:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:50 crc kubenswrapper[4832]: I1003 00:07:50.970410 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:50 crc kubenswrapper[4832]: I1003 00:07:50.970459 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:50 crc kubenswrapper[4832]: I1003 00:07:50.970471 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:50 crc kubenswrapper[4832]: I1003 00:07:50.970489 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:50 crc kubenswrapper[4832]: I1003 00:07:50.970499 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:50Z","lastTransitionTime":"2025-10-03T00:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:51 crc kubenswrapper[4832]: I1003 00:07:51.073447 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:51 crc kubenswrapper[4832]: I1003 00:07:51.073483 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:51 crc kubenswrapper[4832]: I1003 00:07:51.073494 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:51 crc kubenswrapper[4832]: I1003 00:07:51.073510 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:51 crc kubenswrapper[4832]: I1003 00:07:51.073520 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:51Z","lastTransitionTime":"2025-10-03T00:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:51 crc kubenswrapper[4832]: I1003 00:07:51.176701 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:51 crc kubenswrapper[4832]: I1003 00:07:51.176801 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:51 crc kubenswrapper[4832]: I1003 00:07:51.176815 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:51 crc kubenswrapper[4832]: I1003 00:07:51.176841 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:51 crc kubenswrapper[4832]: I1003 00:07:51.176855 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:51Z","lastTransitionTime":"2025-10-03T00:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:51 crc kubenswrapper[4832]: I1003 00:07:51.281412 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:51 crc kubenswrapper[4832]: I1003 00:07:51.281448 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:51 crc kubenswrapper[4832]: I1003 00:07:51.281458 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:51 crc kubenswrapper[4832]: I1003 00:07:51.281472 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:51 crc kubenswrapper[4832]: I1003 00:07:51.281481 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:51Z","lastTransitionTime":"2025-10-03T00:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:51 crc kubenswrapper[4832]: I1003 00:07:51.384082 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:51 crc kubenswrapper[4832]: I1003 00:07:51.384116 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:51 crc kubenswrapper[4832]: I1003 00:07:51.384124 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:51 crc kubenswrapper[4832]: I1003 00:07:51.384138 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:51 crc kubenswrapper[4832]: I1003 00:07:51.384147 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:51Z","lastTransitionTime":"2025-10-03T00:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:51 crc kubenswrapper[4832]: I1003 00:07:51.386289 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 00:07:51 crc kubenswrapper[4832]: E1003 00:07:51.386369 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 00:07:51 crc kubenswrapper[4832]: I1003 00:07:51.386479 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 00:07:51 crc kubenswrapper[4832]: E1003 00:07:51.386528 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 00:07:51 crc kubenswrapper[4832]: I1003 00:07:51.386633 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ppthn" Oct 03 00:07:51 crc kubenswrapper[4832]: E1003 00:07:51.386685 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ppthn" podUID="f3e796ab-f354-47c2-8f1b-1d29fe9ab059" Oct 03 00:07:51 crc kubenswrapper[4832]: I1003 00:07:51.387016 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 00:07:51 crc kubenswrapper[4832]: E1003 00:07:51.387097 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 00:07:51 crc kubenswrapper[4832]: I1003 00:07:51.486159 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:51 crc kubenswrapper[4832]: I1003 00:07:51.486193 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:51 crc kubenswrapper[4832]: I1003 00:07:51.486208 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:51 crc kubenswrapper[4832]: I1003 00:07:51.486223 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:51 crc kubenswrapper[4832]: I1003 00:07:51.486234 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:51Z","lastTransitionTime":"2025-10-03T00:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:51 crc kubenswrapper[4832]: I1003 00:07:51.588450 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:51 crc kubenswrapper[4832]: I1003 00:07:51.588488 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:51 crc kubenswrapper[4832]: I1003 00:07:51.588499 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:51 crc kubenswrapper[4832]: I1003 00:07:51.588546 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:51 crc kubenswrapper[4832]: I1003 00:07:51.588555 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:51Z","lastTransitionTime":"2025-10-03T00:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:51 crc kubenswrapper[4832]: I1003 00:07:51.690616 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:51 crc kubenswrapper[4832]: I1003 00:07:51.690649 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:51 crc kubenswrapper[4832]: I1003 00:07:51.690660 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:51 crc kubenswrapper[4832]: I1003 00:07:51.690676 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:51 crc kubenswrapper[4832]: I1003 00:07:51.690687 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:51Z","lastTransitionTime":"2025-10-03T00:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:51 crc kubenswrapper[4832]: I1003 00:07:51.793508 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:51 crc kubenswrapper[4832]: I1003 00:07:51.793535 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:51 crc kubenswrapper[4832]: I1003 00:07:51.793542 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:51 crc kubenswrapper[4832]: I1003 00:07:51.793556 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:51 crc kubenswrapper[4832]: I1003 00:07:51.793566 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:51Z","lastTransitionTime":"2025-10-03T00:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:51 crc kubenswrapper[4832]: I1003 00:07:51.896293 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:51 crc kubenswrapper[4832]: I1003 00:07:51.896327 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:51 crc kubenswrapper[4832]: I1003 00:07:51.896338 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:51 crc kubenswrapper[4832]: I1003 00:07:51.896377 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:51 crc kubenswrapper[4832]: I1003 00:07:51.896389 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:51Z","lastTransitionTime":"2025-10-03T00:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:51 crc kubenswrapper[4832]: I1003 00:07:51.999354 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:51 crc kubenswrapper[4832]: I1003 00:07:51.999377 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:51 crc kubenswrapper[4832]: I1003 00:07:51.999386 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:51 crc kubenswrapper[4832]: I1003 00:07:51.999398 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:51 crc kubenswrapper[4832]: I1003 00:07:51.999408 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:51Z","lastTransitionTime":"2025-10-03T00:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:52 crc kubenswrapper[4832]: I1003 00:07:52.102243 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:52 crc kubenswrapper[4832]: I1003 00:07:52.102282 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:52 crc kubenswrapper[4832]: I1003 00:07:52.102290 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:52 crc kubenswrapper[4832]: I1003 00:07:52.102304 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:52 crc kubenswrapper[4832]: I1003 00:07:52.102313 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:52Z","lastTransitionTime":"2025-10-03T00:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:52 crc kubenswrapper[4832]: I1003 00:07:52.204525 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:52 crc kubenswrapper[4832]: I1003 00:07:52.204568 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:52 crc kubenswrapper[4832]: I1003 00:07:52.204576 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:52 crc kubenswrapper[4832]: I1003 00:07:52.204592 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:52 crc kubenswrapper[4832]: I1003 00:07:52.204603 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:52Z","lastTransitionTime":"2025-10-03T00:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:52 crc kubenswrapper[4832]: I1003 00:07:52.306633 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:52 crc kubenswrapper[4832]: I1003 00:07:52.306677 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:52 crc kubenswrapper[4832]: I1003 00:07:52.306688 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:52 crc kubenswrapper[4832]: I1003 00:07:52.306722 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:52 crc kubenswrapper[4832]: I1003 00:07:52.306733 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:52Z","lastTransitionTime":"2025-10-03T00:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:52 crc kubenswrapper[4832]: I1003 00:07:52.409134 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:52 crc kubenswrapper[4832]: I1003 00:07:52.409169 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:52 crc kubenswrapper[4832]: I1003 00:07:52.409177 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:52 crc kubenswrapper[4832]: I1003 00:07:52.409191 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:52 crc kubenswrapper[4832]: I1003 00:07:52.409200 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:52Z","lastTransitionTime":"2025-10-03T00:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:52 crc kubenswrapper[4832]: I1003 00:07:52.511666 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:52 crc kubenswrapper[4832]: I1003 00:07:52.511711 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:52 crc kubenswrapper[4832]: I1003 00:07:52.511722 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:52 crc kubenswrapper[4832]: I1003 00:07:52.511739 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:52 crc kubenswrapper[4832]: I1003 00:07:52.511775 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:52Z","lastTransitionTime":"2025-10-03T00:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:52 crc kubenswrapper[4832]: I1003 00:07:52.613931 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:52 crc kubenswrapper[4832]: I1003 00:07:52.613967 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:52 crc kubenswrapper[4832]: I1003 00:07:52.613980 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:52 crc kubenswrapper[4832]: I1003 00:07:52.613997 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:52 crc kubenswrapper[4832]: I1003 00:07:52.614007 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:52Z","lastTransitionTime":"2025-10-03T00:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:52 crc kubenswrapper[4832]: I1003 00:07:52.716197 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:52 crc kubenswrapper[4832]: I1003 00:07:52.716232 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:52 crc kubenswrapper[4832]: I1003 00:07:52.716243 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:52 crc kubenswrapper[4832]: I1003 00:07:52.716256 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:52 crc kubenswrapper[4832]: I1003 00:07:52.716265 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:52Z","lastTransitionTime":"2025-10-03T00:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:52 crc kubenswrapper[4832]: I1003 00:07:52.818355 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:52 crc kubenswrapper[4832]: I1003 00:07:52.818409 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:52 crc kubenswrapper[4832]: I1003 00:07:52.818428 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:52 crc kubenswrapper[4832]: I1003 00:07:52.818450 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:52 crc kubenswrapper[4832]: I1003 00:07:52.818466 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:52Z","lastTransitionTime":"2025-10-03T00:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:52 crc kubenswrapper[4832]: I1003 00:07:52.921174 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:52 crc kubenswrapper[4832]: I1003 00:07:52.921208 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:52 crc kubenswrapper[4832]: I1003 00:07:52.921219 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:52 crc kubenswrapper[4832]: I1003 00:07:52.921235 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:52 crc kubenswrapper[4832]: I1003 00:07:52.921247 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:52Z","lastTransitionTime":"2025-10-03T00:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:53 crc kubenswrapper[4832]: I1003 00:07:53.023139 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:53 crc kubenswrapper[4832]: I1003 00:07:53.023184 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:53 crc kubenswrapper[4832]: I1003 00:07:53.023197 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:53 crc kubenswrapper[4832]: I1003 00:07:53.023213 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:53 crc kubenswrapper[4832]: I1003 00:07:53.023224 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:53Z","lastTransitionTime":"2025-10-03T00:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:53 crc kubenswrapper[4832]: I1003 00:07:53.126309 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:53 crc kubenswrapper[4832]: I1003 00:07:53.126361 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:53 crc kubenswrapper[4832]: I1003 00:07:53.126371 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:53 crc kubenswrapper[4832]: I1003 00:07:53.126384 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:53 crc kubenswrapper[4832]: I1003 00:07:53.126392 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:53Z","lastTransitionTime":"2025-10-03T00:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:53 crc kubenswrapper[4832]: I1003 00:07:53.229458 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:53 crc kubenswrapper[4832]: I1003 00:07:53.229492 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:53 crc kubenswrapper[4832]: I1003 00:07:53.229501 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:53 crc kubenswrapper[4832]: I1003 00:07:53.229518 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:53 crc kubenswrapper[4832]: I1003 00:07:53.229527 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:53Z","lastTransitionTime":"2025-10-03T00:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:53 crc kubenswrapper[4832]: I1003 00:07:53.332383 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:53 crc kubenswrapper[4832]: I1003 00:07:53.332419 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:53 crc kubenswrapper[4832]: I1003 00:07:53.332428 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:53 crc kubenswrapper[4832]: I1003 00:07:53.332442 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:53 crc kubenswrapper[4832]: I1003 00:07:53.332452 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:53Z","lastTransitionTime":"2025-10-03T00:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:53 crc kubenswrapper[4832]: I1003 00:07:53.386262 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ppthn" Oct 03 00:07:53 crc kubenswrapper[4832]: E1003 00:07:53.386410 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ppthn" podUID="f3e796ab-f354-47c2-8f1b-1d29fe9ab059" Oct 03 00:07:53 crc kubenswrapper[4832]: I1003 00:07:53.386461 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 00:07:53 crc kubenswrapper[4832]: I1003 00:07:53.386472 4832 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 00:07:53 crc kubenswrapper[4832]: I1003 00:07:53.386499 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 00:07:53 crc kubenswrapper[4832]: E1003 00:07:53.386601 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 00:07:53 crc kubenswrapper[4832]: E1003 00:07:53.386531 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 00:07:53 crc kubenswrapper[4832]: E1003 00:07:53.386836 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 00:07:53 crc kubenswrapper[4832]: I1003 00:07:53.435506 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:53 crc kubenswrapper[4832]: I1003 00:07:53.435576 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:53 crc kubenswrapper[4832]: I1003 00:07:53.435587 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:53 crc kubenswrapper[4832]: I1003 00:07:53.435603 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:53 crc kubenswrapper[4832]: I1003 00:07:53.435618 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:53Z","lastTransitionTime":"2025-10-03T00:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:53 crc kubenswrapper[4832]: I1003 00:07:53.539135 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:53 crc kubenswrapper[4832]: I1003 00:07:53.539199 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:53 crc kubenswrapper[4832]: I1003 00:07:53.539211 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:53 crc kubenswrapper[4832]: I1003 00:07:53.539226 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:53 crc kubenswrapper[4832]: I1003 00:07:53.539236 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:53Z","lastTransitionTime":"2025-10-03T00:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:53 crc kubenswrapper[4832]: I1003 00:07:53.641776 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:53 crc kubenswrapper[4832]: I1003 00:07:53.641815 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:53 crc kubenswrapper[4832]: I1003 00:07:53.641826 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:53 crc kubenswrapper[4832]: I1003 00:07:53.641842 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:53 crc kubenswrapper[4832]: I1003 00:07:53.641852 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:53Z","lastTransitionTime":"2025-10-03T00:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:53 crc kubenswrapper[4832]: I1003 00:07:53.644799 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:53 crc kubenswrapper[4832]: I1003 00:07:53.644830 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:53 crc kubenswrapper[4832]: I1003 00:07:53.644842 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:53 crc kubenswrapper[4832]: I1003 00:07:53.644857 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:53 crc kubenswrapper[4832]: I1003 00:07:53.644867 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:53Z","lastTransitionTime":"2025-10-03T00:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:53 crc kubenswrapper[4832]: E1003 00:07:53.656502 4832 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T00:07:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T00:07:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T00:07:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T00:07:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1e5f816a-2677-4a9b-8b73-ae876a2ba5f2\\\",\\\"systemUUID\\\":\\\"22c336a9-c598-416a-a85e-c54be0a6a3c5\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:53Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:53 crc kubenswrapper[4832]: I1003 00:07:53.660004 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:53 crc kubenswrapper[4832]: I1003 00:07:53.660038 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 00:07:53 crc kubenswrapper[4832]: I1003 00:07:53.660051 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:53 crc kubenswrapper[4832]: I1003 00:07:53.660070 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:53 crc kubenswrapper[4832]: I1003 00:07:53.660084 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:53Z","lastTransitionTime":"2025-10-03T00:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:53 crc kubenswrapper[4832]: E1003 00:07:53.671289 4832 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T00:07:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T00:07:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T00:07:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T00:07:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1e5f816a-2677-4a9b-8b73-ae876a2ba5f2\\\",\\\"systemUUID\\\":\\\"22c336a9-c598-416a-a85e-c54be0a6a3c5\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:53Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:53 crc kubenswrapper[4832]: I1003 00:07:53.674799 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:53 crc kubenswrapper[4832]: I1003 00:07:53.674828 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 00:07:53 crc kubenswrapper[4832]: I1003 00:07:53.674836 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:53 crc kubenswrapper[4832]: I1003 00:07:53.674849 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:53 crc kubenswrapper[4832]: I1003 00:07:53.674858 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:53Z","lastTransitionTime":"2025-10-03T00:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:53 crc kubenswrapper[4832]: E1003 00:07:53.685355 4832 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T00:07:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T00:07:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T00:07:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T00:07:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1e5f816a-2677-4a9b-8b73-ae876a2ba5f2\\\",\\\"systemUUID\\\":\\\"22c336a9-c598-416a-a85e-c54be0a6a3c5\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:53Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:53 crc kubenswrapper[4832]: I1003 00:07:53.688569 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:53 crc kubenswrapper[4832]: I1003 00:07:53.688609 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 00:07:53 crc kubenswrapper[4832]: I1003 00:07:53.688618 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:53 crc kubenswrapper[4832]: I1003 00:07:53.688632 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:53 crc kubenswrapper[4832]: I1003 00:07:53.688641 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:53Z","lastTransitionTime":"2025-10-03T00:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:53 crc kubenswrapper[4832]: E1003 00:07:53.699674 4832 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T00:07:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T00:07:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T00:07:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T00:07:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1e5f816a-2677-4a9b-8b73-ae876a2ba5f2\\\",\\\"systemUUID\\\":\\\"22c336a9-c598-416a-a85e-c54be0a6a3c5\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:53Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:53 crc kubenswrapper[4832]: I1003 00:07:53.709225 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:53 crc kubenswrapper[4832]: I1003 00:07:53.709263 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 00:07:53 crc kubenswrapper[4832]: I1003 00:07:53.709273 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:53 crc kubenswrapper[4832]: I1003 00:07:53.709289 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:53 crc kubenswrapper[4832]: I1003 00:07:53.709302 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:53Z","lastTransitionTime":"2025-10-03T00:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:53 crc kubenswrapper[4832]: E1003 00:07:53.720271 4832 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T00:07:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T00:07:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T00:07:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T00:07:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1e5f816a-2677-4a9b-8b73-ae876a2ba5f2\\\",\\\"systemUUID\\\":\\\"22c336a9-c598-416a-a85e-c54be0a6a3c5\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:53Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:53 crc kubenswrapper[4832]: E1003 00:07:53.720383 4832 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 03 00:07:53 crc kubenswrapper[4832]: I1003 00:07:53.744965 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 03 00:07:53 crc kubenswrapper[4832]: I1003 00:07:53.745023 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:53 crc kubenswrapper[4832]: I1003 00:07:53.745034 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:53 crc kubenswrapper[4832]: I1003 00:07:53.745053 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:53 crc kubenswrapper[4832]: I1003 00:07:53.745065 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:53Z","lastTransitionTime":"2025-10-03T00:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:53 crc kubenswrapper[4832]: I1003 00:07:53.847947 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:53 crc kubenswrapper[4832]: I1003 00:07:53.847988 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:53 crc kubenswrapper[4832]: I1003 00:07:53.847998 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:53 crc kubenswrapper[4832]: I1003 00:07:53.848012 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:53 crc kubenswrapper[4832]: I1003 00:07:53.848023 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:53Z","lastTransitionTime":"2025-10-03T00:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:53 crc kubenswrapper[4832]: I1003 00:07:53.951068 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:53 crc kubenswrapper[4832]: I1003 00:07:53.951108 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:53 crc kubenswrapper[4832]: I1003 00:07:53.951118 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:53 crc kubenswrapper[4832]: I1003 00:07:53.951134 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:53 crc kubenswrapper[4832]: I1003 00:07:53.951142 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:53Z","lastTransitionTime":"2025-10-03T00:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:54 crc kubenswrapper[4832]: I1003 00:07:54.053523 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:54 crc kubenswrapper[4832]: I1003 00:07:54.053566 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:54 crc kubenswrapper[4832]: I1003 00:07:54.053578 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:54 crc kubenswrapper[4832]: I1003 00:07:54.053620 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:54 crc kubenswrapper[4832]: I1003 00:07:54.053634 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:54Z","lastTransitionTime":"2025-10-03T00:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:54 crc kubenswrapper[4832]: I1003 00:07:54.157323 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:54 crc kubenswrapper[4832]: I1003 00:07:54.157371 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:54 crc kubenswrapper[4832]: I1003 00:07:54.157386 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:54 crc kubenswrapper[4832]: I1003 00:07:54.157409 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:54 crc kubenswrapper[4832]: I1003 00:07:54.157424 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:54Z","lastTransitionTime":"2025-10-03T00:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:54 crc kubenswrapper[4832]: I1003 00:07:54.261605 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:54 crc kubenswrapper[4832]: I1003 00:07:54.261668 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:54 crc kubenswrapper[4832]: I1003 00:07:54.261693 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:54 crc kubenswrapper[4832]: I1003 00:07:54.261729 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:54 crc kubenswrapper[4832]: I1003 00:07:54.261794 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:54Z","lastTransitionTime":"2025-10-03T00:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:54 crc kubenswrapper[4832]: I1003 00:07:54.364271 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:54 crc kubenswrapper[4832]: I1003 00:07:54.364353 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:54 crc kubenswrapper[4832]: I1003 00:07:54.364371 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:54 crc kubenswrapper[4832]: I1003 00:07:54.364402 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:54 crc kubenswrapper[4832]: I1003 00:07:54.364423 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:54Z","lastTransitionTime":"2025-10-03T00:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:54 crc kubenswrapper[4832]: I1003 00:07:54.469049 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:54 crc kubenswrapper[4832]: I1003 00:07:54.469104 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:54 crc kubenswrapper[4832]: I1003 00:07:54.469115 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:54 crc kubenswrapper[4832]: I1003 00:07:54.469133 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:54 crc kubenswrapper[4832]: I1003 00:07:54.469145 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:54Z","lastTransitionTime":"2025-10-03T00:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:54 crc kubenswrapper[4832]: I1003 00:07:54.572459 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:54 crc kubenswrapper[4832]: I1003 00:07:54.572501 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:54 crc kubenswrapper[4832]: I1003 00:07:54.572512 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:54 crc kubenswrapper[4832]: I1003 00:07:54.572528 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:54 crc kubenswrapper[4832]: I1003 00:07:54.572538 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:54Z","lastTransitionTime":"2025-10-03T00:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:54 crc kubenswrapper[4832]: I1003 00:07:54.674843 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:54 crc kubenswrapper[4832]: I1003 00:07:54.674894 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:54 crc kubenswrapper[4832]: I1003 00:07:54.674903 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:54 crc kubenswrapper[4832]: I1003 00:07:54.674916 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:54 crc kubenswrapper[4832]: I1003 00:07:54.674927 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:54Z","lastTransitionTime":"2025-10-03T00:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:54 crc kubenswrapper[4832]: I1003 00:07:54.777596 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:54 crc kubenswrapper[4832]: I1003 00:07:54.777633 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:54 crc kubenswrapper[4832]: I1003 00:07:54.777645 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:54 crc kubenswrapper[4832]: I1003 00:07:54.777661 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:54 crc kubenswrapper[4832]: I1003 00:07:54.777671 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:54Z","lastTransitionTime":"2025-10-03T00:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:54 crc kubenswrapper[4832]: I1003 00:07:54.879799 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:54 crc kubenswrapper[4832]: I1003 00:07:54.879845 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:54 crc kubenswrapper[4832]: I1003 00:07:54.879875 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:54 crc kubenswrapper[4832]: I1003 00:07:54.879893 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:54 crc kubenswrapper[4832]: I1003 00:07:54.879921 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:54Z","lastTransitionTime":"2025-10-03T00:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:54 crc kubenswrapper[4832]: I1003 00:07:54.982787 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:54 crc kubenswrapper[4832]: I1003 00:07:54.982817 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:54 crc kubenswrapper[4832]: I1003 00:07:54.982826 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:54 crc kubenswrapper[4832]: I1003 00:07:54.982839 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:54 crc kubenswrapper[4832]: I1003 00:07:54.982849 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:54Z","lastTransitionTime":"2025-10-03T00:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:55 crc kubenswrapper[4832]: I1003 00:07:55.085302 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:55 crc kubenswrapper[4832]: I1003 00:07:55.085354 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:55 crc kubenswrapper[4832]: I1003 00:07:55.085363 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:55 crc kubenswrapper[4832]: I1003 00:07:55.085379 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:55 crc kubenswrapper[4832]: I1003 00:07:55.085389 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:55Z","lastTransitionTime":"2025-10-03T00:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:55 crc kubenswrapper[4832]: I1003 00:07:55.187408 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:55 crc kubenswrapper[4832]: I1003 00:07:55.187450 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:55 crc kubenswrapper[4832]: I1003 00:07:55.187461 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:55 crc kubenswrapper[4832]: I1003 00:07:55.187476 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:55 crc kubenswrapper[4832]: I1003 00:07:55.187484 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:55Z","lastTransitionTime":"2025-10-03T00:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:55 crc kubenswrapper[4832]: I1003 00:07:55.290717 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:55 crc kubenswrapper[4832]: I1003 00:07:55.290857 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:55 crc kubenswrapper[4832]: I1003 00:07:55.290890 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:55 crc kubenswrapper[4832]: I1003 00:07:55.290929 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:55 crc kubenswrapper[4832]: I1003 00:07:55.290953 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:55Z","lastTransitionTime":"2025-10-03T00:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:55 crc kubenswrapper[4832]: I1003 00:07:55.386258 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ppthn" Oct 03 00:07:55 crc kubenswrapper[4832]: I1003 00:07:55.386330 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 00:07:55 crc kubenswrapper[4832]: E1003 00:07:55.386394 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ppthn" podUID="f3e796ab-f354-47c2-8f1b-1d29fe9ab059" Oct 03 00:07:55 crc kubenswrapper[4832]: I1003 00:07:55.386330 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 00:07:55 crc kubenswrapper[4832]: I1003 00:07:55.386521 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 00:07:55 crc kubenswrapper[4832]: E1003 00:07:55.387257 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 00:07:55 crc kubenswrapper[4832]: E1003 00:07:55.387421 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 00:07:55 crc kubenswrapper[4832]: E1003 00:07:55.387534 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 00:07:55 crc kubenswrapper[4832]: I1003 00:07:55.392447 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:55 crc kubenswrapper[4832]: I1003 00:07:55.392492 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:55 crc kubenswrapper[4832]: I1003 00:07:55.392507 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:55 crc kubenswrapper[4832]: I1003 00:07:55.392525 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:55 crc kubenswrapper[4832]: I1003 00:07:55.392539 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:55Z","lastTransitionTime":"2025-10-03T00:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:55 crc kubenswrapper[4832]: I1003 00:07:55.495791 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:55 crc kubenswrapper[4832]: I1003 00:07:55.495838 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:55 crc kubenswrapper[4832]: I1003 00:07:55.495849 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:55 crc kubenswrapper[4832]: I1003 00:07:55.495873 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:55 crc kubenswrapper[4832]: I1003 00:07:55.495885 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:55Z","lastTransitionTime":"2025-10-03T00:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:55 crc kubenswrapper[4832]: I1003 00:07:55.598541 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:55 crc kubenswrapper[4832]: I1003 00:07:55.598599 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:55 crc kubenswrapper[4832]: I1003 00:07:55.598615 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:55 crc kubenswrapper[4832]: I1003 00:07:55.598634 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:55 crc kubenswrapper[4832]: I1003 00:07:55.598649 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:55Z","lastTransitionTime":"2025-10-03T00:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:55 crc kubenswrapper[4832]: I1003 00:07:55.701766 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:55 crc kubenswrapper[4832]: I1003 00:07:55.701829 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:55 crc kubenswrapper[4832]: I1003 00:07:55.701840 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:55 crc kubenswrapper[4832]: I1003 00:07:55.701861 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:55 crc kubenswrapper[4832]: I1003 00:07:55.701875 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:55Z","lastTransitionTime":"2025-10-03T00:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:55 crc kubenswrapper[4832]: I1003 00:07:55.804979 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:55 crc kubenswrapper[4832]: I1003 00:07:55.805049 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:55 crc kubenswrapper[4832]: I1003 00:07:55.805072 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:55 crc kubenswrapper[4832]: I1003 00:07:55.805095 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:55 crc kubenswrapper[4832]: I1003 00:07:55.805117 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:55Z","lastTransitionTime":"2025-10-03T00:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:55 crc kubenswrapper[4832]: I1003 00:07:55.907568 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:55 crc kubenswrapper[4832]: I1003 00:07:55.907613 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:55 crc kubenswrapper[4832]: I1003 00:07:55.907624 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:55 crc kubenswrapper[4832]: I1003 00:07:55.907642 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:55 crc kubenswrapper[4832]: I1003 00:07:55.907656 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:55Z","lastTransitionTime":"2025-10-03T00:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:56 crc kubenswrapper[4832]: I1003 00:07:56.010939 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:56 crc kubenswrapper[4832]: I1003 00:07:56.011017 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:56 crc kubenswrapper[4832]: I1003 00:07:56.011040 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:56 crc kubenswrapper[4832]: I1003 00:07:56.011072 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:56 crc kubenswrapper[4832]: I1003 00:07:56.011091 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:56Z","lastTransitionTime":"2025-10-03T00:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:56 crc kubenswrapper[4832]: I1003 00:07:56.113706 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:56 crc kubenswrapper[4832]: I1003 00:07:56.113791 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:56 crc kubenswrapper[4832]: I1003 00:07:56.113803 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:56 crc kubenswrapper[4832]: I1003 00:07:56.113820 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:56 crc kubenswrapper[4832]: I1003 00:07:56.113832 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:56Z","lastTransitionTime":"2025-10-03T00:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:56 crc kubenswrapper[4832]: I1003 00:07:56.216457 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:56 crc kubenswrapper[4832]: I1003 00:07:56.216505 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:56 crc kubenswrapper[4832]: I1003 00:07:56.216517 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:56 crc kubenswrapper[4832]: I1003 00:07:56.216532 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:56 crc kubenswrapper[4832]: I1003 00:07:56.216544 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:56Z","lastTransitionTime":"2025-10-03T00:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:56 crc kubenswrapper[4832]: I1003 00:07:56.319504 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:56 crc kubenswrapper[4832]: I1003 00:07:56.319540 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:56 crc kubenswrapper[4832]: I1003 00:07:56.319550 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:56 crc kubenswrapper[4832]: I1003 00:07:56.319566 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:56 crc kubenswrapper[4832]: I1003 00:07:56.319577 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:56Z","lastTransitionTime":"2025-10-03T00:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:56 crc kubenswrapper[4832]: I1003 00:07:56.421907 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:56 crc kubenswrapper[4832]: I1003 00:07:56.421942 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:56 crc kubenswrapper[4832]: I1003 00:07:56.421972 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:56 crc kubenswrapper[4832]: I1003 00:07:56.421983 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:56 crc kubenswrapper[4832]: I1003 00:07:56.421993 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:56Z","lastTransitionTime":"2025-10-03T00:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:56 crc kubenswrapper[4832]: I1003 00:07:56.524631 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:56 crc kubenswrapper[4832]: I1003 00:07:56.524678 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:56 crc kubenswrapper[4832]: I1003 00:07:56.524687 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:56 crc kubenswrapper[4832]: I1003 00:07:56.524703 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:56 crc kubenswrapper[4832]: I1003 00:07:56.524713 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:56Z","lastTransitionTime":"2025-10-03T00:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:56 crc kubenswrapper[4832]: I1003 00:07:56.626945 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:56 crc kubenswrapper[4832]: I1003 00:07:56.626997 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:56 crc kubenswrapper[4832]: I1003 00:07:56.627009 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:56 crc kubenswrapper[4832]: I1003 00:07:56.627025 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:56 crc kubenswrapper[4832]: I1003 00:07:56.627036 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:56Z","lastTransitionTime":"2025-10-03T00:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:56 crc kubenswrapper[4832]: I1003 00:07:56.728834 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:56 crc kubenswrapper[4832]: I1003 00:07:56.728885 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:56 crc kubenswrapper[4832]: I1003 00:07:56.728897 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:56 crc kubenswrapper[4832]: I1003 00:07:56.728914 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:56 crc kubenswrapper[4832]: I1003 00:07:56.728924 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:56Z","lastTransitionTime":"2025-10-03T00:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:56 crc kubenswrapper[4832]: I1003 00:07:56.831230 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:56 crc kubenswrapper[4832]: I1003 00:07:56.831270 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:56 crc kubenswrapper[4832]: I1003 00:07:56.831278 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:56 crc kubenswrapper[4832]: I1003 00:07:56.831295 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:56 crc kubenswrapper[4832]: I1003 00:07:56.831305 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:56Z","lastTransitionTime":"2025-10-03T00:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:56 crc kubenswrapper[4832]: I1003 00:07:56.933446 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:56 crc kubenswrapper[4832]: I1003 00:07:56.933489 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:56 crc kubenswrapper[4832]: I1003 00:07:56.933500 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:56 crc kubenswrapper[4832]: I1003 00:07:56.933517 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:56 crc kubenswrapper[4832]: I1003 00:07:56.933528 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:56Z","lastTransitionTime":"2025-10-03T00:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:57 crc kubenswrapper[4832]: I1003 00:07:57.036017 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:57 crc kubenswrapper[4832]: I1003 00:07:57.036095 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:57 crc kubenswrapper[4832]: I1003 00:07:57.036110 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:57 crc kubenswrapper[4832]: I1003 00:07:57.036135 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:57 crc kubenswrapper[4832]: I1003 00:07:57.036151 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:57Z","lastTransitionTime":"2025-10-03T00:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:57 crc kubenswrapper[4832]: I1003 00:07:57.139430 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:57 crc kubenswrapper[4832]: I1003 00:07:57.139509 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:57 crc kubenswrapper[4832]: I1003 00:07:57.139531 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:57 crc kubenswrapper[4832]: I1003 00:07:57.139560 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:57 crc kubenswrapper[4832]: I1003 00:07:57.139582 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:57Z","lastTransitionTime":"2025-10-03T00:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:57 crc kubenswrapper[4832]: I1003 00:07:57.242164 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:57 crc kubenswrapper[4832]: I1003 00:07:57.242199 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:57 crc kubenswrapper[4832]: I1003 00:07:57.242207 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:57 crc kubenswrapper[4832]: I1003 00:07:57.242219 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:57 crc kubenswrapper[4832]: I1003 00:07:57.242227 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:57Z","lastTransitionTime":"2025-10-03T00:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:57 crc kubenswrapper[4832]: I1003 00:07:57.344483 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:57 crc kubenswrapper[4832]: I1003 00:07:57.344551 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:57 crc kubenswrapper[4832]: I1003 00:07:57.344573 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:57 crc kubenswrapper[4832]: I1003 00:07:57.344607 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:57 crc kubenswrapper[4832]: I1003 00:07:57.344630 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:57Z","lastTransitionTime":"2025-10-03T00:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:57 crc kubenswrapper[4832]: I1003 00:07:57.386506 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 00:07:57 crc kubenswrapper[4832]: I1003 00:07:57.386695 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ppthn" Oct 03 00:07:57 crc kubenswrapper[4832]: I1003 00:07:57.386905 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 00:07:57 crc kubenswrapper[4832]: E1003 00:07:57.386890 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 00:07:57 crc kubenswrapper[4832]: I1003 00:07:57.386948 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 00:07:57 crc kubenswrapper[4832]: E1003 00:07:57.387101 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ppthn" podUID="f3e796ab-f354-47c2-8f1b-1d29fe9ab059" Oct 03 00:07:57 crc kubenswrapper[4832]: E1003 00:07:57.387134 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 00:07:57 crc kubenswrapper[4832]: E1003 00:07:57.387179 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 00:07:57 crc kubenswrapper[4832]: I1003 00:07:57.403418 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78a6b04d90ea1fd4356f0a54594a83fd6029be216c6263654ad4916f2c3b7f5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:57Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:57 crc kubenswrapper[4832]: I1003 00:07:57.417886 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-b9c9g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4369beec-ad3f-4de9-909b-7d0f8c76aad3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://898c5e76b7f0bbad018b2aa9db1e27240ccbd9d5b66854f548aa6d664abc8799\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvpm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-b9c9g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:57Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:57 crc kubenswrapper[4832]: I1003 00:07:57.440424 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f189ec1-8169-4886-bff6-f67f7958050e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ba40dbb3c6b603f4c75b3f017b0b8953a189aac6dba4cc0e1fa5174b6353f3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2c63d70335d75dec7e19ae160dfa2a00f834d9aa3725bb4b63b94aa488fca0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3b296d00954bf341fee4ad258c6ee44adbd2950c89e73aaf5141201b61c5332\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7712a571bceb2c88adb90f9f18cdef6f7666e96341c1f9c87e6a55c82b59efd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6ee507d1127a1ca82e96f57a031cfef9adbd0b310588ce49a32f0cbcab7b3f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e493b0aab45e7a7842b48f0a13a6850b3bcd8a8f2147880ff278b92b04e9535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d07c7c33d5db217414d17af04896ed7415afa2e32a6d9686a330231dd0d0a0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d07c7c33d5db217414d17af04896ed7415afa2e32a6d9686a330231dd0d0a0f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T00:07:36Z\\\",\\\"message\\\":\\\"-crc openshift-multus/multus-additional-cni-plugins-zdfsz openshift-image-registry/node-ca-76ck7 openshift-machine-config-operator/machine-config-daemon-xzpk4 openshift-network-diagnostics/network-check-source-55646444c4-trplf openshift-network-operator/network-operator-58b4c7f79c-55gtf openshift-network-operator/iptables-alerter-4ln5h]\\\\nI1003 00:07:36.207478 6499 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI1003 00:07:36.207478 6499 lb_config.go:1031] Cluster endpoints for openshift-network-console/networking-console-plugin for network=default are: map[]\\\\nF1003 00:07:36.207491 6499 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certific\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:35Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-zc9xl_openshift-ovn-kubernetes(0f189ec1-8169-4886-bff6-f67f7958050e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae1e89f85d166170fd478f58e653b46eaa1c8335c1b124af9fde5b805864197b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0669e6b46f57beece4c83940bcffe580a4b4e2db5451a9c9ca4a9ad486fc95b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0669e6b46f57beece4c83940bcffe580a4b4e2db5451a9c9ca4a9ad486fc95b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zc9xl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:57Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:57 crc kubenswrapper[4832]: I1003 00:07:57.447117 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:57 crc kubenswrapper[4832]: I1003 00:07:57.447149 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:57 crc kubenswrapper[4832]: I1003 00:07:57.447160 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:57 crc kubenswrapper[4832]: I1003 00:07:57.447176 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:57 crc kubenswrapper[4832]: I1003 00:07:57.447188 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:57Z","lastTransitionTime":"2025-10-03T00:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:57 crc kubenswrapper[4832]: I1003 00:07:57.454199 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fad7ffe6-f721-4c38-8acd-2d4de59b527b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da0287ba03b221e49a8840f93c53df8a865b9216a4e93f4a0fc4c62d79227004\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edaa07dca4d8aba0a6439a43af7397b5d9ef4d5a398f5c03ac93f3f41fb80ed9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef77e4c71ea4d3dd7aa61afb0787bf88352730ae638cfefe1f9f9f033208692a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0636ac12ebbda7e4ee8bbe4f5be5454ca7a879c92c8f444466a34c387668398c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af8925d64d9d95117625cfca58e2f0c90128ce63c82370e3a505d48fce14e474\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"message\\\":\\\"file observer\\\\nW1003 00:07:06.003008 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 00:07:06.003232 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 00:07:06.007835 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-713505854/tls.crt::/tmp/serving-cert-713505854/tls.key\\\\\\\"\\\\nI1003 00:07:06.247728 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 00:07:06.253351 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 00:07:06.253377 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 00:07:06.253665 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 00:07:06.253677 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 00:07:06.261087 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 00:07:06.261596 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 00:07:06.261602 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 00:07:06.261606 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 00:07:06.261610 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 00:07:06.261613 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 00:07:06.261616 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 00:07:06.261905 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 00:07:06.264368 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc6fca5c4906594dc721c77dba62a306d840e88ed949a9a579a695bdc1f1c011\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://359b3f91c772c18a52ed1e5985275405b990187a860cf7a7f62c2f602f54c77e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://359b3f91c772c18a52ed1e5985275405b990187a860cf7a7f62c2f602f54c77e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:06:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:57Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:57 crc kubenswrapper[4832]: I1003 00:07:57.470798 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a530701-412d-4b02-b532-de7993881992\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2b0553c8f774a096bc1446325436a1a73a24f67b3512462ce0b5029092f30c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0451961c80d1affd1cc6903d8c4ccba7de47c25255826d49ee312958f7a73b79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09e57b2ff533ebe739ef6e23dcd8d6f5ef85143046db77ed3a95ffb87f771152\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://703d4dc13279deda56e8154ab7238b507e78852fd477db2bbe6e3e19aa0fbd37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:06:47Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:57Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:57 crc kubenswrapper[4832]: I1003 00:07:57.484799 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"75a3f03f-3483-4611-9098-1bc8f776bf72\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2179d1da37c6a7dc24550fceca8e5f629ced4a0bfc34d8e0439be45b1a1128b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a5815e928eca00e1782c57d1b18e6fb2f7db287f404ea3c83323efe2db13752\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4
.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c798af55ddd694a06f5faa8f350941ad89970fa629efb67ad4f0ed08608192c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://990d28547daf6496b88f4cbe2716fce600053a9cbff1effd1a7bcf555de9f990\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://990d28547daf6496b88f4cbe2716fce600053a9cbff1effd1a7bcf555de9f990\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:06:47Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:57Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:57 crc kubenswrapper[4832]: I1003 00:07:57.498732 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with 
unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:57Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:57 crc kubenswrapper[4832]: I1003 00:07:57.511650 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"13eeb165-19f7-4e3f-a442-1c9c570d853b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://196d3468b2a4ea6792c19f85b28df1b5df5e612428d986d2affed70158ba92f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfnk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://786805325c320e5c08542c14a9ac9790f7c2dec237737fe1aca1523d1f2d26a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfnk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xzpk4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:57Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:57 crc kubenswrapper[4832]: I1003 00:07:57.534470 4832 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31c3e0e9-c2a7-4c60-b523-9972df4adad5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1688f7425bdf5630147752e6d2c0dc58351b86d087c39bc6c4d73ffbf11949b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70f4161badca9a8087ab95b77ede5bc72d0c44b3f7f0223f2073354e76d6070d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5eb1fa15d84200378239e8a064cb83aa401750e92073f9d29d94858e65add776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-di
r\\\"}]},{\\\"containerID\\\":\\\"cri-o://342b151ec083dc566428af857e4013b70d34fbf70632147d13f5e251ef3460c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f39a6c9881b2f1d620a85a3c6887d82b81d18ecf599c66b967d5580689537314\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a180babeb6469b2158474f74047864c74117a5c272be9437712bc1729563108b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a180babeb6469b2158474f74047864c74117a5c272be9437712bc1729563108b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b12fcbe54af7073178320f8a130e78196b2d1d3a055c4d8410e4cc1da2d4686e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b12fcbe54af7073178320f8a130e78196b2d1d3a055c4d8410e4cc1da2d4686e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}}},{\\\"containerID\\\"
:\\\"cri-o://f5e7853d0c8e96873c687c75d7226d8c11ed0ab5f4f2e1e975423de06b7c8c10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5e7853d0c8e96873c687c75d7226d8c11ed0ab5f4f2e1e975423de06b7c8c10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:06:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:57Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:57 crc kubenswrapper[4832]: I1003 00:07:57.547852 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:57Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:57 crc kubenswrapper[4832]: I1003 00:07:57.549836 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:57 crc kubenswrapper[4832]: I1003 00:07:57.549867 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:57 crc kubenswrapper[4832]: I1003 00:07:57.549881 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:57 crc kubenswrapper[4832]: I1003 00:07:57.549897 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:57 crc kubenswrapper[4832]: I1003 00:07:57.549907 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:57Z","lastTransitionTime":"2025-10-03T00:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:57 crc kubenswrapper[4832]: I1003 00:07:57.559962 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4175afbbdf7905f2c69f608e1cd85fdfae2818582d88e9a6d46c20ba0cede788\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:57Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:57 crc kubenswrapper[4832]: I1003 00:07:57.571191 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f3e796ab-f354-47c2-8f1b-1d29fe9ab059-metrics-certs\") pod \"network-metrics-daemon-ppthn\" (UID: \"f3e796ab-f354-47c2-8f1b-1d29fe9ab059\") " pod="openshift-multus/network-metrics-daemon-ppthn" Oct 03 00:07:57 crc kubenswrapper[4832]: E1003 00:07:57.571322 4832 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 00:07:57 crc kubenswrapper[4832]: E1003 00:07:57.571386 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f3e796ab-f354-47c2-8f1b-1d29fe9ab059-metrics-certs podName:f3e796ab-f354-47c2-8f1b-1d29fe9ab059 nodeName:}" failed. No retries permitted until 2025-10-03 00:08:29.571369064 +0000 UTC m=+102.833524170 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f3e796ab-f354-47c2-8f1b-1d29fe9ab059-metrics-certs") pod "network-metrics-daemon-ppthn" (UID: "f3e796ab-f354-47c2-8f1b-1d29fe9ab059") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 00:07:57 crc kubenswrapper[4832]: I1003 00:07:57.573138 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6753f7eb0f3ff7e4c94125513be77026242079d658b7139e86701a2b04ba5d1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://823db4c8596fa85759a3d818b0282b8b8b2c4c2e948160accbd306ef09e38060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-10-03T00:07:57Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:57 crc kubenswrapper[4832]: I1003 00:07:57.587144 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rwqt7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96bf1ee1-9249-4db9-83da-17eb15aa5f9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://778c9afa5747e424ff72688b9d90dd21e3faec1f409c2b9683b5a16a37685369\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm92t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":
\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rwqt7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:57Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:57 crc kubenswrapper[4832]: I1003 00:07:57.599749 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-76ck7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b962b7f-e442-4e25-97f4-21f16b9f1324\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e5df7a035e3c381c541f08eec6fffe846e7002a534b4c6ae1316a15cf74e469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ksm2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-76ck7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:57Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:57 crc kubenswrapper[4832]: I1003 00:07:57.612055 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4j5bh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7262a85e-7e44-4422-b1d2-5d045503e06a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afd2b89ee54380c73b04bde347dce4d74a847e4ba94fc7aeabd5ac24e4dfd4b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vwhr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f03e95e621cb9e7d0c04b426440cf746eeccf68892215e1ed9e963935b9fdba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vwhr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4j5bh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:57Z is after 2025-08-24T17:21:41Z" Oct 03 
00:07:57 crc kubenswrapper[4832]: I1003 00:07:57.625795 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ppthn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3e796ab-f354-47c2-8f1b-1d29fe9ab059\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:25Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ppthn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:57Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:57 crc kubenswrapper[4832]: I1003 00:07:57.643124 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:57Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:57 crc kubenswrapper[4832]: I1003 00:07:57.652293 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:57 crc kubenswrapper[4832]: I1003 00:07:57.652330 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:57 crc kubenswrapper[4832]: I1003 00:07:57.652340 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:57 crc kubenswrapper[4832]: I1003 00:07:57.652355 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:57 crc kubenswrapper[4832]: I1003 00:07:57.652365 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:57Z","lastTransitionTime":"2025-10-03T00:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:57 crc kubenswrapper[4832]: I1003 00:07:57.657036 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zdfsz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66a6f849-6870-4e28-8ca4-c849c22ad252\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53e3e718893a18a23dee11b28bb7285f26501527c6aa136acf2b8f7e0d7494c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f49f638397d6dffb41b5774d362b7441d3191f290149de6ecef5c38a6e8a7a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f49f638397d6dffb41b5774d362b7441d3191f290149de6ecef5c38a6e8a7a81\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d52a70ad1026d8eaaeed9dbb581eb90dbc962f036711e2f68085ac437c46b1fd\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d52a70ad1026d8eaaeed9dbb581eb90dbc962f036711e2f68085ac437c46b1fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ddd5a446bcc39d02bbaf8ae1b9a92b550b975c4280cea13a517076261e177bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ddd5a446bcc39d02bbaf8ae1b9a92b550b975c4280cea13a517076261e177bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f907c9873e2f8cbedebcc3116624dbaf2e6b8088d9f0ef52a5878e1515e73720\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f907c9873e2f8cbedebcc3116624dbaf2e6b8088d9f0ef52a5878e1515e73720\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d991977ea5d7140ddc09fecfc872067a05a7773c0066c94ec88c603b6f79f32e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d991977ea5d7140ddc09fecfc872067a05a7773c0066c94ec88c603b6f79f32e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7711e37ec666f0285f64eac11397ea7747d74997deaba1e9cc141218d28fbaa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7711e37ec666f0285f64eac11397ea7747d74997deaba1e9cc141218d28fbaa7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zdfsz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:57Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:57 crc kubenswrapper[4832]: I1003 00:07:57.754508 4832 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:57 crc kubenswrapper[4832]: I1003 00:07:57.754560 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:57 crc kubenswrapper[4832]: I1003 00:07:57.754572 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:57 crc kubenswrapper[4832]: I1003 00:07:57.754592 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:57 crc kubenswrapper[4832]: I1003 00:07:57.754607 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:57Z","lastTransitionTime":"2025-10-03T00:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:57 crc kubenswrapper[4832]: I1003 00:07:57.856919 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:57 crc kubenswrapper[4832]: I1003 00:07:57.856986 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:57 crc kubenswrapper[4832]: I1003 00:07:57.856998 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:57 crc kubenswrapper[4832]: I1003 00:07:57.857011 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:57 crc kubenswrapper[4832]: I1003 00:07:57.857021 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:57Z","lastTransitionTime":"2025-10-03T00:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:57 crc kubenswrapper[4832]: I1003 00:07:57.959539 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:57 crc kubenswrapper[4832]: I1003 00:07:57.959590 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:57 crc kubenswrapper[4832]: I1003 00:07:57.959611 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:57 crc kubenswrapper[4832]: I1003 00:07:57.959638 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:57 crc kubenswrapper[4832]: I1003 00:07:57.959657 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:57Z","lastTransitionTime":"2025-10-03T00:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:58 crc kubenswrapper[4832]: I1003 00:07:58.061711 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:58 crc kubenswrapper[4832]: I1003 00:07:58.061819 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:58 crc kubenswrapper[4832]: I1003 00:07:58.061848 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:58 crc kubenswrapper[4832]: I1003 00:07:58.061880 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:58 crc kubenswrapper[4832]: I1003 00:07:58.061903 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:58Z","lastTransitionTime":"2025-10-03T00:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:58 crc kubenswrapper[4832]: I1003 00:07:58.164596 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:58 crc kubenswrapper[4832]: I1003 00:07:58.164633 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:58 crc kubenswrapper[4832]: I1003 00:07:58.164660 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:58 crc kubenswrapper[4832]: I1003 00:07:58.164674 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:58 crc kubenswrapper[4832]: I1003 00:07:58.164684 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:58Z","lastTransitionTime":"2025-10-03T00:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:58 crc kubenswrapper[4832]: I1003 00:07:58.267156 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:58 crc kubenswrapper[4832]: I1003 00:07:58.267206 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:58 crc kubenswrapper[4832]: I1003 00:07:58.267220 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:58 crc kubenswrapper[4832]: I1003 00:07:58.267235 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:58 crc kubenswrapper[4832]: I1003 00:07:58.267244 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:58Z","lastTransitionTime":"2025-10-03T00:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:58 crc kubenswrapper[4832]: I1003 00:07:58.369329 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:58 crc kubenswrapper[4832]: I1003 00:07:58.369373 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:58 crc kubenswrapper[4832]: I1003 00:07:58.369386 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:58 crc kubenswrapper[4832]: I1003 00:07:58.369402 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:58 crc kubenswrapper[4832]: I1003 00:07:58.369413 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:58Z","lastTransitionTime":"2025-10-03T00:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:58 crc kubenswrapper[4832]: I1003 00:07:58.471395 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:58 crc kubenswrapper[4832]: I1003 00:07:58.471441 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:58 crc kubenswrapper[4832]: I1003 00:07:58.471450 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:58 crc kubenswrapper[4832]: I1003 00:07:58.471468 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:58 crc kubenswrapper[4832]: I1003 00:07:58.471477 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:58Z","lastTransitionTime":"2025-10-03T00:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:58 crc kubenswrapper[4832]: I1003 00:07:58.573544 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:58 crc kubenswrapper[4832]: I1003 00:07:58.573581 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:58 crc kubenswrapper[4832]: I1003 00:07:58.573589 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:58 crc kubenswrapper[4832]: I1003 00:07:58.573602 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:58 crc kubenswrapper[4832]: I1003 00:07:58.573613 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:58Z","lastTransitionTime":"2025-10-03T00:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:58 crc kubenswrapper[4832]: I1003 00:07:58.676378 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:58 crc kubenswrapper[4832]: I1003 00:07:58.676439 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:58 crc kubenswrapper[4832]: I1003 00:07:58.676450 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:58 crc kubenswrapper[4832]: I1003 00:07:58.676465 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:58 crc kubenswrapper[4832]: I1003 00:07:58.676477 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:58Z","lastTransitionTime":"2025-10-03T00:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:58 crc kubenswrapper[4832]: I1003 00:07:58.778606 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:58 crc kubenswrapper[4832]: I1003 00:07:58.778644 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:58 crc kubenswrapper[4832]: I1003 00:07:58.778653 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:58 crc kubenswrapper[4832]: I1003 00:07:58.778667 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:58 crc kubenswrapper[4832]: I1003 00:07:58.778693 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:58Z","lastTransitionTime":"2025-10-03T00:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:58 crc kubenswrapper[4832]: I1003 00:07:58.814804 4832 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-rwqt7_96bf1ee1-9249-4db9-83da-17eb15aa5f9a/kube-multus/0.log" Oct 03 00:07:58 crc kubenswrapper[4832]: I1003 00:07:58.814857 4832 generic.go:334] "Generic (PLEG): container finished" podID="96bf1ee1-9249-4db9-83da-17eb15aa5f9a" containerID="778c9afa5747e424ff72688b9d90dd21e3faec1f409c2b9683b5a16a37685369" exitCode=1 Oct 03 00:07:58 crc kubenswrapper[4832]: I1003 00:07:58.814888 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-rwqt7" event={"ID":"96bf1ee1-9249-4db9-83da-17eb15aa5f9a","Type":"ContainerDied","Data":"778c9afa5747e424ff72688b9d90dd21e3faec1f409c2b9683b5a16a37685369"} Oct 03 00:07:58 crc kubenswrapper[4832]: I1003 00:07:58.815249 4832 scope.go:117] "RemoveContainer" containerID="778c9afa5747e424ff72688b9d90dd21e3faec1f409c2b9683b5a16a37685369" Oct 03 00:07:58 crc kubenswrapper[4832]: I1003 00:07:58.830270 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:58Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:58 crc kubenswrapper[4832]: I1003 00:07:58.847549 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4175afbbdf7905f2c69f608e1cd85fdfae2818582d88e9a6d46c20ba0cede788\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:58Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:58 crc kubenswrapper[4832]: I1003 00:07:58.860163 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6753f7eb0f3ff7e4c94125513be77026242079d658b7139e86701a2b04ba5d1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://823db4c8596fa85759a3d818b0282b8b8b2c4c2e948160accbd306ef09e38060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:58Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:58 crc kubenswrapper[4832]: I1003 00:07:58.873945 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rwqt7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96bf1ee1-9249-4db9-83da-17eb15aa5f9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://778c9afa5747e424ff72688b9d90dd21e3faec1f409c2b9683b5a16a37685369\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://778c9afa5747e424ff72688b9d90dd21e3faec1f409c2b9683b5a16a37685369\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T00:07:58Z\\\",\\\"message\\\":\\\"2025-10-03T00:07:13+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_849ee2bf-aa0e-479e-b017-cbd23c73d446\\\\n2025-10-03T00:07:13+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_849ee2bf-aa0e-479e-b017-cbd23c73d446 to /host/opt/cni/bin/\\\\n2025-10-03T00:07:13Z [verbose] multus-daemon started\\\\n2025-10-03T00:07:13Z [verbose] Readiness Indicator file check\\\\n2025-10-03T00:07:58Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm92t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rwqt7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:58Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:58 crc kubenswrapper[4832]: I1003 00:07:58.880867 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:58 crc kubenswrapper[4832]: I1003 00:07:58.880919 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:58 crc kubenswrapper[4832]: I1003 00:07:58.880937 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:58 crc kubenswrapper[4832]: I1003 00:07:58.880963 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:58 crc kubenswrapper[4832]: I1003 00:07:58.880980 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:58Z","lastTransitionTime":"2025-10-03T00:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:58 crc kubenswrapper[4832]: I1003 00:07:58.884061 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-76ck7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b962b7f-e442-4e25-97f4-21f16b9f1324\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e5df7a035e3c381c541f08eec6fffe846e7002a534b4c6ae1316a15cf74e469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ksm2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-76ck7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:58Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:58 crc kubenswrapper[4832]: I1003 00:07:58.897719 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4j5bh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7262a85e-7e44-4422-b1d2-5d045503e06a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afd2b89ee54380c73b04bde347dce4d74a847e4ba94fc7aeabd5ac24e4dfd4b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vwhr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f03e95e621cb9e7d0c04b426440cf746eeccf68892215e1ed9e963935b9fdba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vwhr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4j5bh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:58Z is after 2025-08-24T17:21:41Z" Oct 03 
00:07:58 crc kubenswrapper[4832]: I1003 00:07:58.909345 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ppthn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3e796ab-f354-47c2-8f1b-1d29fe9ab059\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:25Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ppthn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:58Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:58 crc kubenswrapper[4832]: I1003 00:07:58.928696 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"31c3e0e9-c2a7-4c60-b523-9972df4adad5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1688f7425bdf5630147752e6d2c0dc58351b86d087c39bc6c4d73ffbf11949b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70f4161badca9a8087ab95b77ede5bc72d0c44b3f7f0223f2073354e76d6070d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5eb1fa15d84200378239e8a064cb83aa401750e92073f9d29d94858e65add776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://342b151ec083dc566428af857e4013b70d34fbf
70632147d13f5e251ef3460c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f39a6c9881b2f1d620a85a3c6887d82b81d18ecf599c66b967d5580689537314\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a180babeb6469b2158474f74047864c74117a5c272be9437712bc1729563108b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a180babeb6469b2158474f74047864c74117a5c272be9437712bc1729563108b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b12fcbe54af7073178320f8a130e78196b2d1d3a055c4d8410e4cc1da2d4686e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b12fcbe54af7073178320f8a130e78196b2d1d3a055c4d8410e4cc1da2d4686e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f5e7853d0c8e96873c687c75d7226d8c11ed0ab5f4f2e1e975423de06b7c8c10\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5e7853d0c8e96873c687c75d7226d8c11ed0ab5f4f2e1e975423de06b7c8c10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:06:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:58Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:58 crc kubenswrapper[4832]: I1003 00:07:58.944504 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zdfsz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66a6f849-6870-4e28-8ca4-c849c22ad252\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53e3e718893a18a23dee11b28bb7285f26501527c6aa136acf2b8f7e0d7494c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f49f638397d6dffb41b5774d362b7441d3191f290149de6ecef5c38a6e8a7a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f49f638397d6dffb41b5774d362b7441d3191f290149de6ecef5c38a6e8a7a81\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d52a70ad1026d8eaaeed9dbb581eb90dbc962f036711e2f68085ac437c46b1fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d52a70ad1026d8eaaeed9dbb581eb90dbc962f036711e2f68085ac437c46b1fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ddd5a446bcc39d02bbaf8ae1b9a92b550b975c4280cea13a517076261e177bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ddd5a446bcc39d02bbaf8ae1b9a92b550b975c4280cea13a517076261e177bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:13Z\\\",\\\"
reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f907c9873e2f8cbedebcc3116624dbaf2e6b8088d9f0ef52a5878e1515e73720\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f907c9873e2f8cbedebcc3116624dbaf2e6b8088d9f0ef52a5878e1515e73720\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d991977ea5d7140ddc09fecfc872067a05a7773c0066c94ec88c603b6f79f32e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d991977ea5d7140ddc09fecfc872067a05a7773c0066c94ec88c603b6f79f32e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7711e37ec666f0285f64eac11397ea7747d74997deaba1e9cc141218d28fbaa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473
a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7711e37ec666f0285f64eac11397ea7747d74997deaba1e9cc141218d28fbaa7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zdfsz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:58Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:58 crc kubenswrapper[4832]: I1003 00:07:58.955515 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:58Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:58 crc kubenswrapper[4832]: I1003 00:07:58.967351 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-b9c9g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4369beec-ad3f-4de9-909b-7d0f8c76aad3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://898c5e76b7f0bbad018b2aa9db1e27240ccbd9d5b66854f548aa6d664abc8799\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvpm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-b9c9g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-10-03T00:07:58Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:58 crc kubenswrapper[4832]: I1003 00:07:58.982806 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:58 crc kubenswrapper[4832]: I1003 00:07:58.982832 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:58 crc kubenswrapper[4832]: I1003 00:07:58.982842 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:58 crc kubenswrapper[4832]: I1003 00:07:58.982856 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:58 crc kubenswrapper[4832]: I1003 00:07:58.982865 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:58Z","lastTransitionTime":"2025-10-03T00:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:58 crc kubenswrapper[4832]: I1003 00:07:58.984685 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f189ec1-8169-4886-bff6-f67f7958050e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ba40dbb3c6b603f4c75b3f017b0b8953a189aac6dba4cc0e1fa5174b6353f3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2c63d70335d75dec7e19ae160dfa2a00f834d9aa3725bb4b63b94aa488fca0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3b296d00954bf341fee4ad258c6ee44adbd2950c89e73aaf5141201b61c5332\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7712a571bceb2c88adb90f9f18cdef6f7666e96341c1f9c87e6a55c82b59efd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6ee507d1127a1ca82e96f57a031cfef9adbd0b310588ce49a32f0cbcab7b3f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e493b0aab45e7a7842b48f0a13a6850b3bcd8a8f2147880ff278b92b04e9535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d07c7c33d5db217414d17af04896ed7415afa2e
32a6d9686a330231dd0d0a0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d07c7c33d5db217414d17af04896ed7415afa2e32a6d9686a330231dd0d0a0f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T00:07:36Z\\\",\\\"message\\\":\\\"-crc openshift-multus/multus-additional-cni-plugins-zdfsz openshift-image-registry/node-ca-76ck7 openshift-machine-config-operator/machine-config-daemon-xzpk4 openshift-network-diagnostics/network-check-source-55646444c4-trplf openshift-network-operator/network-operator-58b4c7f79c-55gtf openshift-network-operator/iptables-alerter-4ln5h]\\\\nI1003 00:07:36.207478 6499 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI1003 00:07:36.207478 6499 lb_config.go:1031] Cluster endpoints for openshift-network-console/networking-console-plugin for network=default are: map[]\\\\nF1003 00:07:36.207491 6499 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certific\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:35Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-zc9xl_openshift-ovn-kubernetes(0f189ec1-8169-4886-bff6-f67f7958050e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae1e89f85d166170fd478f58e653b46eaa1c8335c1b124af9fde5b805864197b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0669e6b46f57beece4c83940bcffe580a4b4e2db5451a9c9ca4a9ad486fc95b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0669e6b46f57beece4c83940bcffe580a4b4e2db5451a9c9ca4a9ad486fc95b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zc9xl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:58Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:58 crc kubenswrapper[4832]: I1003 00:07:58.996198 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78a6b04d90ea1fd4356f0a54594a83fd6029be216c6263654ad4916f2c3b7f5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed 
to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:58Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:59 crc kubenswrapper[4832]: I1003 00:07:59.007457 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fad7ffe6-f721-4c38-8acd-2d4de59b527b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da0287ba03b221e49a8840f93c53df8a865b9216a4e93f4a0fc4c62d79227004\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edaa07dca4d8aba0a6439a43af7397b5d9ef4d5a398f5c03ac93f3f41fb80ed9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef77e4c71ea4d3dd7aa61afb0787bf88352730ae638cfefe1f9f9f033208692a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\
\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0636ac12ebbda7e4ee8bbe4f5be5454ca7a879c92c8f444466a34c387668398c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af8925d64d9d95117625cfca58e2f0c90128ce63c82370e3a505d48fce14e474\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"message\\\":\\\"file observer\\\\nW1003 00:07:06.003008 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 00:07:06.003232 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 00:07:06.007835 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-713505854/tls.crt::/tmp/serving-cert-713505854/tls.key\\\\\\\"\\\\nI1003 00:07:06.247728 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 00:07:06.253351 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 00:07:06.253377 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 00:07:06.253665 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 00:07:06.253677 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 00:07:06.261087 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 00:07:06.261596 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 00:07:06.261602 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 00:07:06.261606 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 00:07:06.261610 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 00:07:06.261613 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 00:07:06.261616 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 00:07:06.261905 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 00:07:06.264368 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc6fca5c4906594dc721c77dba62a306d840e88ed949a9a579a695bdc1f1c011\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://359b3f91c772c18a52ed1e5985275405b990187a860cf7a7f62c2f602f54c77e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://359b3f91c772c18a52ed1e5985275405b990187a860cf7a7f62c2f602f54c77e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:06:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:59Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:59 crc kubenswrapper[4832]: I1003 00:07:59.018462 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a530701-412d-4b02-b532-de7993881992\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2b0553c8f774a096bc1446325436a1a73a24f67b3512462ce0b5029092f30c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0451961c80d1affd1cc6903d8c4ccba7de47c25255826d49ee312958f7a73b79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09e57b2ff533ebe739ef6e23dcd8d6f5ef85143046db77ed3a95ffb87f771152\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://703d4dc13279deda56e8154ab7238b507e78852fd477db2bbe6e3e19aa0fbd37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:06:47Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:59Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:59 crc kubenswrapper[4832]: I1003 00:07:59.028480 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"75a3f03f-3483-4611-9098-1bc8f776bf72\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2179d1da37c6a7dc24550fceca8e5f629ced4a0bfc34d8e0439be45b1a1128b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a5815e928eca00e1782c57d1b18e6fb2f7db287f404ea3c83323efe2db13752\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4
.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c798af55ddd694a06f5faa8f350941ad89970fa629efb67ad4f0ed08608192c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://990d28547daf6496b88f4cbe2716fce600053a9cbff1effd1a7bcf555de9f990\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://990d28547daf6496b88f4cbe2716fce600053a9cbff1effd1a7bcf555de9f990\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:06:47Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:59Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:59 crc kubenswrapper[4832]: I1003 00:07:59.040516 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with 
unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:59Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:59 crc kubenswrapper[4832]: I1003 00:07:59.049273 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"13eeb165-19f7-4e3f-a442-1c9c570d853b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://196d3468b2a4ea6792c19f85b28df1b5df5e612428d986d2affed70158ba92f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfnk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://786805325c320e5c08542c14a9ac9790f7c2dec237737fe1aca1523d1f2d26a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfnk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xzpk4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:59Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:59 crc kubenswrapper[4832]: I1003 00:07:59.085369 4832 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:59 crc kubenswrapper[4832]: I1003 00:07:59.085428 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:59 crc kubenswrapper[4832]: I1003 00:07:59.085437 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:59 crc kubenswrapper[4832]: I1003 00:07:59.085451 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:59 crc kubenswrapper[4832]: I1003 00:07:59.085461 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:59Z","lastTransitionTime":"2025-10-03T00:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:59 crc kubenswrapper[4832]: I1003 00:07:59.187416 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:59 crc kubenswrapper[4832]: I1003 00:07:59.187445 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:59 crc kubenswrapper[4832]: I1003 00:07:59.187454 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:59 crc kubenswrapper[4832]: I1003 00:07:59.187466 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:59 crc kubenswrapper[4832]: I1003 00:07:59.187475 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:59Z","lastTransitionTime":"2025-10-03T00:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:59 crc kubenswrapper[4832]: I1003 00:07:59.290433 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:59 crc kubenswrapper[4832]: I1003 00:07:59.290493 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:59 crc kubenswrapper[4832]: I1003 00:07:59.290515 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:59 crc kubenswrapper[4832]: I1003 00:07:59.290543 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:59 crc kubenswrapper[4832]: I1003 00:07:59.290564 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:59Z","lastTransitionTime":"2025-10-03T00:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:59 crc kubenswrapper[4832]: I1003 00:07:59.388017 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ppthn" Oct 03 00:07:59 crc kubenswrapper[4832]: I1003 00:07:59.388078 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 00:07:59 crc kubenswrapper[4832]: I1003 00:07:59.388038 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 00:07:59 crc kubenswrapper[4832]: E1003 00:07:59.388236 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 00:07:59 crc kubenswrapper[4832]: E1003 00:07:59.388292 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 00:07:59 crc kubenswrapper[4832]: E1003 00:07:59.388400 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ppthn" podUID="f3e796ab-f354-47c2-8f1b-1d29fe9ab059" Oct 03 00:07:59 crc kubenswrapper[4832]: I1003 00:07:59.388667 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 00:07:59 crc kubenswrapper[4832]: E1003 00:07:59.388776 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 00:07:59 crc kubenswrapper[4832]: I1003 00:07:59.393182 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:59 crc kubenswrapper[4832]: I1003 00:07:59.393213 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:59 crc kubenswrapper[4832]: I1003 00:07:59.393222 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:59 crc kubenswrapper[4832]: I1003 00:07:59.393234 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:59 crc kubenswrapper[4832]: I1003 00:07:59.393244 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:59Z","lastTransitionTime":"2025-10-03T00:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:59 crc kubenswrapper[4832]: I1003 00:07:59.495470 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:59 crc kubenswrapper[4832]: I1003 00:07:59.495510 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:59 crc kubenswrapper[4832]: I1003 00:07:59.495518 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:59 crc kubenswrapper[4832]: I1003 00:07:59.495532 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:59 crc kubenswrapper[4832]: I1003 00:07:59.495541 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:59Z","lastTransitionTime":"2025-10-03T00:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:59 crc kubenswrapper[4832]: I1003 00:07:59.598078 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:59 crc kubenswrapper[4832]: I1003 00:07:59.598150 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:59 crc kubenswrapper[4832]: I1003 00:07:59.598173 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:59 crc kubenswrapper[4832]: I1003 00:07:59.598202 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:59 crc kubenswrapper[4832]: I1003 00:07:59.598224 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:59Z","lastTransitionTime":"2025-10-03T00:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:59 crc kubenswrapper[4832]: I1003 00:07:59.700623 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:59 crc kubenswrapper[4832]: I1003 00:07:59.700684 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:59 crc kubenswrapper[4832]: I1003 00:07:59.700706 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:59 crc kubenswrapper[4832]: I1003 00:07:59.700815 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:59 crc kubenswrapper[4832]: I1003 00:07:59.700852 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:59Z","lastTransitionTime":"2025-10-03T00:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:07:59 crc kubenswrapper[4832]: I1003 00:07:59.803615 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:59 crc kubenswrapper[4832]: I1003 00:07:59.803653 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:59 crc kubenswrapper[4832]: I1003 00:07:59.803665 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:59 crc kubenswrapper[4832]: I1003 00:07:59.803680 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:59 crc kubenswrapper[4832]: I1003 00:07:59.803690 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:59Z","lastTransitionTime":"2025-10-03T00:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:59 crc kubenswrapper[4832]: I1003 00:07:59.819305 4832 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-rwqt7_96bf1ee1-9249-4db9-83da-17eb15aa5f9a/kube-multus/0.log" Oct 03 00:07:59 crc kubenswrapper[4832]: I1003 00:07:59.819363 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-rwqt7" event={"ID":"96bf1ee1-9249-4db9-83da-17eb15aa5f9a","Type":"ContainerStarted","Data":"d3584877a37d5f67dfaff9f091443ff884963e53e62bde6c8cefc55988d8c3a3"} Oct 03 00:07:59 crc kubenswrapper[4832]: I1003 00:07:59.832132 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6753f7eb0f3ff7e4c94125513be77026242079d658b7139e86701a2b04ba5d1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://823db4c8596fa85759a3d818b0282b8b8b2c4c2e948160accbd306ef09e38060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:59Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:59 crc kubenswrapper[4832]: I1003 00:07:59.847518 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rwqt7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96bf1ee1-9249-4db9-83da-17eb15aa5f9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3584877a37d5f67dfaff9f091443ff884963e53e62bde6c8cefc55988d8c3a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://778c9afa5747e424ff72688b9d90dd21e3faec1f409c2b9683b5a16a37685369\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T00:07:58Z\\\",\\\"message\\\":\\\"2025-10-03T00:07:13+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_849ee2bf-aa0e-479e-b017-cbd23c73d446\\\\n2025-10-03T00:07:13+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_849ee2bf-aa0e-479e-b017-cbd23c73d446 to /host/opt/cni/bin/\\\\n2025-10-03T00:07:13Z [verbose] multus-daemon started\\\\n2025-10-03T00:07:13Z [verbose] Readiness Indicator file check\\\\n2025-10-03T00:07:58Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm92t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rwqt7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:59Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:59 crc kubenswrapper[4832]: I1003 00:07:59.857767 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-76ck7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b962b7f-e442-4e25-97f4-21f16b9f1324\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e5df7a035e3c381c541f08eec6fffe846e7002a534b4c6ae1316a15cf74e469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ksm2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-76ck7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:59Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:59 crc kubenswrapper[4832]: I1003 00:07:59.874060 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4j5bh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7262a85e-7e44-4422-b1d2-5d045503e06a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afd2b89ee54380c73b04bde347dce4d74a847e4ba94fc7aeabd5ac24e4dfd4b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vwhr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f03e95e621cb9e7d0c04b426440cf746eeccf68892215e1ed9e963935b9fdba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vwhr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4j5bh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:59Z is after 2025-08-24T17:21:41Z" Oct 03 
00:07:59 crc kubenswrapper[4832]: I1003 00:07:59.886245 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ppthn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3e796ab-f354-47c2-8f1b-1d29fe9ab059\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:25Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ppthn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:59Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:59 crc kubenswrapper[4832]: I1003 00:07:59.904583 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"31c3e0e9-c2a7-4c60-b523-9972df4adad5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1688f7425bdf5630147752e6d2c0dc58351b86d087c39bc6c4d73ffbf11949b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70f4161badca9a8087ab95b77ede5bc72d0c44b3f7f0223f2073354e76d6070d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5eb1fa15d84200378239e8a064cb83aa401750e92073f9d29d94858e65add776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://342b151ec083dc566428af857e4013b70d34fbf
70632147d13f5e251ef3460c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f39a6c9881b2f1d620a85a3c6887d82b81d18ecf599c66b967d5580689537314\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a180babeb6469b2158474f74047864c74117a5c272be9437712bc1729563108b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a180babeb6469b2158474f74047864c74117a5c272be9437712bc1729563108b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b12fcbe54af7073178320f8a130e78196b2d1d3a055c4d8410e4cc1da2d4686e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b12fcbe54af7073178320f8a130e78196b2d1d3a055c4d8410e4cc1da2d4686e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f5e7853d0c8e96873c687c75d7226d8c11ed0ab5f4f2e1e975423de06b7c8c10\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5e7853d0c8e96873c687c75d7226d8c11ed0ab5f4f2e1e975423de06b7c8c10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:06:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:59Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:59 crc kubenswrapper[4832]: I1003 00:07:59.905591 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:07:59 crc kubenswrapper[4832]: I1003 00:07:59.905645 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:07:59 crc kubenswrapper[4832]: I1003 00:07:59.905659 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:07:59 crc kubenswrapper[4832]: I1003 00:07:59.905676 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:07:59 crc kubenswrapper[4832]: I1003 00:07:59.905688 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:07:59Z","lastTransitionTime":"2025-10-03T00:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:07:59 crc kubenswrapper[4832]: I1003 00:07:59.916394 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:59Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:59 crc kubenswrapper[4832]: I1003 00:07:59.927127 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4175afbbdf7905f2c69f608e1cd85fdfae2818582d88e9a6d46c20ba0cede788\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:59Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:59 crc kubenswrapper[4832]: I1003 00:07:59.940207 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:59Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:59 crc kubenswrapper[4832]: I1003 00:07:59.957640 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zdfsz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66a6f849-6870-4e28-8ca4-c849c22ad252\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53e3e718893a18a23dee11b28bb7285f26501527c6aa136acf2b8f7e0d7494c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f49f638397d6dffb41b5774d362b7441d3191f290149de6ecef5c38a6e8a7a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f49f638397d6dffb41b5774d362b7441d3191f290149de6ecef5c38a6e8a7a81\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d52a70ad1026d8eaaeed9dbb581eb90dbc962f036711e2f68085ac437c46b1fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d52a70ad1026d8eaaeed9dbb581eb90dbc962f036711e2f68085ac437c46b1fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ddd5a446bcc39d02bbaf8ae1b9a92b550b975c4280cea13a517076261e177bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ddd5a446bcc39d02bbaf8ae1b9a92b550b975c4280cea13a517076261e177bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f907c9873e2f8cbedebcc3116624dbaf2e6b8088d9f0ef52a5878e1515e73720\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f907c9873e2f8cbedebcc3116624dbaf2e6b8088d9f0ef52a5878e1515e73720\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d991977ea5d7140ddc09fecfc872067a05a7773c0066c94ec88c603b6f79f32e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d991977ea5d7140ddc09fecfc872067a05a7773c0066c94ec88c603b6f79f32e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7711e37ec666f0285f64eac11397ea7747d74997deaba1e9cc141218d28fbaa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7711e37ec666f0285f64eac11397ea7747d74997deaba1e9cc141218d28fbaa7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/c
ni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zdfsz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:59Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:59 crc kubenswrapper[4832]: I1003 00:07:59.968627 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78a6b04d90ea1fd4356f0a54594a83fd6029be216c6263654ad4916f2c3b7f5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:59Z is after 2025-08-24T17:21:41Z" Oct 03 00:07:59 crc kubenswrapper[4832]: I1003 00:07:59.978946 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-b9c9g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4369beec-ad3f-4de9-909b-7d0f8c76aad3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://898c5e76b7f0bbad018b2aa9db1e27240ccbd9d5b66854f548aa6d664abc8799\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvpm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-b9c9g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:07:59Z is after 2025-08-24T17:21:41Z" Oct 03 00:08:00 crc kubenswrapper[4832]: I1003 00:08:00.004902 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f189ec1-8169-4886-bff6-f67f7958050e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ba40dbb3c6b603f4c75b3f017b0b8953a189aac6dba4cc0e1fa5174b6353f3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2c63d70335d75dec7e19ae160dfa2a00f834d9aa3725bb4b63b94aa488fca0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3b296d00954bf341fee4ad258c6ee44adbd2950c89e73aaf5141201b61c5332\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7712a571bceb2c88adb90f9f18cdef6f7666e96341c1f9c87e6a55c82b59efd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6ee507d1127a1ca82e96f57a031cfef9adbd0b310588ce49a32f0cbcab7b3f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e493b0aab45e7a7842b48f0a13a6850b3bcd8a8f2147880ff278b92b04e9535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d07c7c33d5db217414d17af04896ed7415afa2e32a6d9686a330231dd0d0a0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d07c7c33d5db217414d17af04896ed7415afa2e32a6d9686a330231dd0d0a0f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T00:07:36Z\\\",\\\"message\\\":\\\"-crc openshift-multus/multus-additional-cni-plugins-zdfsz openshift-image-registry/node-ca-76ck7 openshift-machine-config-operator/machine-config-daemon-xzpk4 openshift-network-diagnostics/network-check-source-55646444c4-trplf openshift-network-operator/network-operator-58b4c7f79c-55gtf openshift-network-operator/iptables-alerter-4ln5h]\\\\nI1003 00:07:36.207478 6499 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI1003 00:07:36.207478 6499 lb_config.go:1031] Cluster endpoints for openshift-network-console/networking-console-plugin for network=default are: map[]\\\\nF1003 00:07:36.207491 6499 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certific\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:35Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-zc9xl_openshift-ovn-kubernetes(0f189ec1-8169-4886-bff6-f67f7958050e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae1e89f85d166170fd478f58e653b46eaa1c8335c1b124af9fde5b805864197b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0669e6b46f57beece4c83940bcffe580a4b4e2db5451a9c9ca4a9ad486fc95b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0669e6b46f57beece4c83940bcffe580a4b4e2db5451a9c9ca4a9ad486fc95b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zc9xl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:08:00Z is after 2025-08-24T17:21:41Z" Oct 03 00:08:00 crc kubenswrapper[4832]: I1003 00:08:00.008990 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:00 crc kubenswrapper[4832]: I1003 00:08:00.009039 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:00 crc kubenswrapper[4832]: I1003 00:08:00.009052 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:00 crc kubenswrapper[4832]: I1003 00:08:00.009070 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:00 crc kubenswrapper[4832]: I1003 00:08:00.009081 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:00Z","lastTransitionTime":"2025-10-03T00:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:08:00 crc kubenswrapper[4832]: I1003 00:08:00.016520 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"75a3f03f-3483-4611-9098-1bc8f776bf72\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2179d1da37c6a7dc24550fceca8e5f629ced4a0bfc34d8e0439be45b1a1128b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a5815e928eca00e1782c57d1b18e6fb2f7db287f404ea3c83323efe2db13752\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c798af55ddd694a06f5faa8f350941ad89970fa629efb67ad4f0ed08608192c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"
cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://990d28547daf6496b88f4cbe2716fce600053a9cbff1effd1a7bcf555de9f990\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://990d28547daf6496b88f4cbe2716fce600053a9cbff1effd1a7bcf555de9f990\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:06:47Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:08:00Z is after 2025-08-24T17:21:41Z" Oct 03 00:08:00 crc kubenswrapper[4832]: I1003 00:08:00.030080 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:08:00Z is after 2025-08-24T17:21:41Z" Oct 03 00:08:00 crc kubenswrapper[4832]: I1003 00:08:00.041908 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13eeb165-19f7-4e3f-a442-1c9c570d853b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://196d3468b2a4ea6792c19f85b28df1b5df5e612428d986d2affed70158ba92f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfnk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://786805325c320e5c08542c14a9ac9790f7c2dec237737fe1aca1523d1f2d26a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfnk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xzpk4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:08:00Z is after 2025-08-24T17:21:41Z" Oct 03 00:08:00 crc kubenswrapper[4832]: I1003 00:08:00.054633 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fad7ffe6-f721-4c38-8acd-2d4de59b527b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da0287ba03b221e49a8840f93c53df8a865b9216a4e93f4a0fc4c62d79227004\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edaa07dca4d8aba0a6439a43af7397b5d9ef4d5a398f5c03ac93f3f41fb80ed9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6
355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef77e4c71ea4d3dd7aa61afb0787bf88352730ae638cfefe1f9f9f033208692a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0636ac12ebbda7e4ee8bbe4f5be5454ca7a879c92c8f444466a34c387668398c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af8925d64d9d95117625cfca58e2f0c90128ce63c82370e3a505d48fce14e474\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"message\\\":\\\"file observer\\\\nW1003 00:07:06.003008 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 00:07:06.003232 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 00:07:06.007835 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-713505854/tls.crt::/tmp/serving-cert-713505854/tls.key\\\\\\\"\\\\nI1003 00:07:06.247728 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 00:07:06.253351 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 00:07:06.253377 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 00:07:06.253665 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 00:07:06.253677 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 00:07:06.261087 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 00:07:06.261596 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 00:07:06.261602 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 00:07:06.261606 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 00:07:06.261610 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 00:07:06.261613 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 00:07:06.261616 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 00:07:06.261905 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 00:07:06.264368 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc6fca5c4906594dc721c77dba62a306d840e88ed949a9a579a695bdc1f1c011\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://359b3f91c772c18a52ed1e5985275405b990187a860cf7a7f62c2f602f54c77e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://359b3f91c772c18a52ed1e5985275405b990187a860cf7a7f62c2f602f54c77e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:06:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:08:00Z is after 2025-08-24T17:21:41Z" Oct 03 00:08:00 crc kubenswrapper[4832]: I1003 00:08:00.065608 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a530701-412d-4b02-b532-de7993881992\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2b0553c8f774a096bc1446325436a1a73a24f67b3512462ce0b5029092f30c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0451961c80d1affd1cc6903d8c4ccba7de47c25255826d49ee312958f7a73b79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09e57b2ff533ebe739ef6e23dcd8d6f5ef85143046db77ed3a95ffb87f771152\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://703d4dc13279deda56e8154ab7238b507e78852fd477db2bbe6e3e19aa0fbd37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:06:47Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:08:00Z is after 2025-08-24T17:21:41Z" Oct 03 00:08:00 crc kubenswrapper[4832]: I1003 00:08:00.111394 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:00 crc kubenswrapper[4832]: I1003 00:08:00.111432 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:00 crc kubenswrapper[4832]: I1003 00:08:00.111440 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:00 crc kubenswrapper[4832]: I1003 00:08:00.111453 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:00 crc kubenswrapper[4832]: I1003 00:08:00.111464 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:00Z","lastTransitionTime":"2025-10-03T00:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:08:00 crc kubenswrapper[4832]: I1003 00:08:00.213705 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:00 crc kubenswrapper[4832]: I1003 00:08:00.213783 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:00 crc kubenswrapper[4832]: I1003 00:08:00.213795 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:00 crc kubenswrapper[4832]: I1003 00:08:00.213811 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:00 crc kubenswrapper[4832]: I1003 00:08:00.213823 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:00Z","lastTransitionTime":"2025-10-03T00:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:00 crc kubenswrapper[4832]: I1003 00:08:00.316113 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:00 crc kubenswrapper[4832]: I1003 00:08:00.316163 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:00 crc kubenswrapper[4832]: I1003 00:08:00.316177 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:00 crc kubenswrapper[4832]: I1003 00:08:00.316195 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:00 crc kubenswrapper[4832]: I1003 00:08:00.316208 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:00Z","lastTransitionTime":"2025-10-03T00:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:00 crc kubenswrapper[4832]: I1003 00:08:00.419130 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:00 crc kubenswrapper[4832]: I1003 00:08:00.419166 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:00 crc kubenswrapper[4832]: I1003 00:08:00.419175 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:00 crc kubenswrapper[4832]: I1003 00:08:00.419188 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:00 crc kubenswrapper[4832]: I1003 00:08:00.419196 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:00Z","lastTransitionTime":"2025-10-03T00:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:08:00 crc kubenswrapper[4832]: I1003 00:08:00.521007 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:00 crc kubenswrapper[4832]: I1003 00:08:00.521056 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:00 crc kubenswrapper[4832]: I1003 00:08:00.521070 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:00 crc kubenswrapper[4832]: I1003 00:08:00.521088 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:00 crc kubenswrapper[4832]: I1003 00:08:00.521100 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:00Z","lastTransitionTime":"2025-10-03T00:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:00 crc kubenswrapper[4832]: I1003 00:08:00.623853 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:00 crc kubenswrapper[4832]: I1003 00:08:00.623890 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:00 crc kubenswrapper[4832]: I1003 00:08:00.623899 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:00 crc kubenswrapper[4832]: I1003 00:08:00.623913 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:00 crc kubenswrapper[4832]: I1003 00:08:00.623923 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:00Z","lastTransitionTime":"2025-10-03T00:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:00 crc kubenswrapper[4832]: I1003 00:08:00.725403 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:00 crc kubenswrapper[4832]: I1003 00:08:00.725428 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:00 crc kubenswrapper[4832]: I1003 00:08:00.725436 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:00 crc kubenswrapper[4832]: I1003 00:08:00.725449 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:00 crc kubenswrapper[4832]: I1003 00:08:00.725458 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:00Z","lastTransitionTime":"2025-10-03T00:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:08:00 crc kubenswrapper[4832]: I1003 00:08:00.827612 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:00 crc kubenswrapper[4832]: I1003 00:08:00.828028 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:00 crc kubenswrapper[4832]: I1003 00:08:00.828137 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:00 crc kubenswrapper[4832]: I1003 00:08:00.828233 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:00 crc kubenswrapper[4832]: I1003 00:08:00.828318 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:00Z","lastTransitionTime":"2025-10-03T00:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:00 crc kubenswrapper[4832]: I1003 00:08:00.930518 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:00 crc kubenswrapper[4832]: I1003 00:08:00.930552 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:00 crc kubenswrapper[4832]: I1003 00:08:00.930563 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:00 crc kubenswrapper[4832]: I1003 00:08:00.930578 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:00 crc kubenswrapper[4832]: I1003 00:08:00.930588 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:00Z","lastTransitionTime":"2025-10-03T00:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:01 crc kubenswrapper[4832]: I1003 00:08:01.033581 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:01 crc kubenswrapper[4832]: I1003 00:08:01.033916 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:01 crc kubenswrapper[4832]: I1003 00:08:01.034024 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:01 crc kubenswrapper[4832]: I1003 00:08:01.034287 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:01 crc kubenswrapper[4832]: I1003 00:08:01.034392 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:01Z","lastTransitionTime":"2025-10-03T00:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:08:01 crc kubenswrapper[4832]: I1003 00:08:01.136035 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:01 crc kubenswrapper[4832]: I1003 00:08:01.136080 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:01 crc kubenswrapper[4832]: I1003 00:08:01.136091 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:01 crc kubenswrapper[4832]: I1003 00:08:01.136108 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:01 crc kubenswrapper[4832]: I1003 00:08:01.136121 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:01Z","lastTransitionTime":"2025-10-03T00:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:01 crc kubenswrapper[4832]: I1003 00:08:01.239156 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:01 crc kubenswrapper[4832]: I1003 00:08:01.239393 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:01 crc kubenswrapper[4832]: I1003 00:08:01.239521 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:01 crc kubenswrapper[4832]: I1003 00:08:01.239595 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:01 crc kubenswrapper[4832]: I1003 00:08:01.239700 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:01Z","lastTransitionTime":"2025-10-03T00:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:01 crc kubenswrapper[4832]: I1003 00:08:01.342153 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:01 crc kubenswrapper[4832]: I1003 00:08:01.342196 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:01 crc kubenswrapper[4832]: I1003 00:08:01.342208 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:01 crc kubenswrapper[4832]: I1003 00:08:01.342224 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:01 crc kubenswrapper[4832]: I1003 00:08:01.342234 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:01Z","lastTransitionTime":"2025-10-03T00:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:08:01 crc kubenswrapper[4832]: I1003 00:08:01.385659 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ppthn" Oct 03 00:08:01 crc kubenswrapper[4832]: I1003 00:08:01.385669 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 00:08:01 crc kubenswrapper[4832]: I1003 00:08:01.385679 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 00:08:01 crc kubenswrapper[4832]: I1003 00:08:01.385709 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 00:08:01 crc kubenswrapper[4832]: E1003 00:08:01.386178 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 00:08:01 crc kubenswrapper[4832]: E1003 00:08:01.386177 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ppthn" podUID="f3e796ab-f354-47c2-8f1b-1d29fe9ab059" Oct 03 00:08:01 crc kubenswrapper[4832]: E1003 00:08:01.386391 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 00:08:01 crc kubenswrapper[4832]: E1003 00:08:01.386471 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 00:08:01 crc kubenswrapper[4832]: I1003 00:08:01.444866 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:01 crc kubenswrapper[4832]: I1003 00:08:01.445123 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:01 crc kubenswrapper[4832]: I1003 00:08:01.445246 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:01 crc kubenswrapper[4832]: I1003 00:08:01.445336 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:01 crc kubenswrapper[4832]: I1003 00:08:01.445423 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:01Z","lastTransitionTime":"2025-10-03T00:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:01 crc kubenswrapper[4832]: I1003 00:08:01.549021 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:01 crc kubenswrapper[4832]: I1003 00:08:01.549263 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:01 crc kubenswrapper[4832]: I1003 00:08:01.549332 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:01 crc kubenswrapper[4832]: I1003 00:08:01.549413 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:01 crc kubenswrapper[4832]: I1003 00:08:01.549472 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:01Z","lastTransitionTime":"2025-10-03T00:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:08:01 crc kubenswrapper[4832]: I1003 00:08:01.652149 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:01 crc kubenswrapper[4832]: I1003 00:08:01.652386 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:01 crc kubenswrapper[4832]: I1003 00:08:01.652447 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:01 crc kubenswrapper[4832]: I1003 00:08:01.652506 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:01 crc kubenswrapper[4832]: I1003 00:08:01.652571 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:01Z","lastTransitionTime":"2025-10-03T00:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:01 crc kubenswrapper[4832]: I1003 00:08:01.754634 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:01 crc kubenswrapper[4832]: I1003 00:08:01.754667 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:01 crc kubenswrapper[4832]: I1003 00:08:01.754678 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:01 crc kubenswrapper[4832]: I1003 00:08:01.754692 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:01 crc kubenswrapper[4832]: I1003 00:08:01.754703 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:01Z","lastTransitionTime":"2025-10-03T00:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:01 crc kubenswrapper[4832]: I1003 00:08:01.856879 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:01 crc kubenswrapper[4832]: I1003 00:08:01.856922 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:01 crc kubenswrapper[4832]: I1003 00:08:01.856939 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:01 crc kubenswrapper[4832]: I1003 00:08:01.856956 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:01 crc kubenswrapper[4832]: I1003 00:08:01.856967 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:01Z","lastTransitionTime":"2025-10-03T00:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:08:01 crc kubenswrapper[4832]: I1003 00:08:01.958375 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:01 crc kubenswrapper[4832]: I1003 00:08:01.958411 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:01 crc kubenswrapper[4832]: I1003 00:08:01.958431 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:01 crc kubenswrapper[4832]: I1003 00:08:01.958448 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:01 crc kubenswrapper[4832]: I1003 00:08:01.958459 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:01Z","lastTransitionTime":"2025-10-03T00:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:02 crc kubenswrapper[4832]: I1003 00:08:02.060525 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:02 crc kubenswrapper[4832]: I1003 00:08:02.060563 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:02 crc kubenswrapper[4832]: I1003 00:08:02.060572 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:02 crc kubenswrapper[4832]: I1003 00:08:02.060585 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:02 crc kubenswrapper[4832]: I1003 00:08:02.060595 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:02Z","lastTransitionTime":"2025-10-03T00:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:02 crc kubenswrapper[4832]: I1003 00:08:02.162631 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:02 crc kubenswrapper[4832]: I1003 00:08:02.162672 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:02 crc kubenswrapper[4832]: I1003 00:08:02.162689 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:02 crc kubenswrapper[4832]: I1003 00:08:02.162712 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:02 crc kubenswrapper[4832]: I1003 00:08:02.162730 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:02Z","lastTransitionTime":"2025-10-03T00:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:08:02 crc kubenswrapper[4832]: I1003 00:08:02.265284 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:02 crc kubenswrapper[4832]: I1003 00:08:02.265338 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:02 crc kubenswrapper[4832]: I1003 00:08:02.265359 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:02 crc kubenswrapper[4832]: I1003 00:08:02.265390 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:02 crc kubenswrapper[4832]: I1003 00:08:02.265412 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:02Z","lastTransitionTime":"2025-10-03T00:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:02 crc kubenswrapper[4832]: I1003 00:08:02.368275 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:02 crc kubenswrapper[4832]: I1003 00:08:02.368331 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:02 crc kubenswrapper[4832]: I1003 00:08:02.368343 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:02 crc kubenswrapper[4832]: I1003 00:08:02.368363 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:02 crc kubenswrapper[4832]: I1003 00:08:02.368376 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:02Z","lastTransitionTime":"2025-10-03T00:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:08:02 crc kubenswrapper[4832]: I1003 00:08:02.386959 4832 scope.go:117] "RemoveContainer" containerID="4d07c7c33d5db217414d17af04896ed7415afa2e32a6d9686a330231dd0d0a0f" Oct 03 00:08:02 crc kubenswrapper[4832]: I1003 00:08:02.471091 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:02 crc kubenswrapper[4832]: I1003 00:08:02.471137 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:02 crc kubenswrapper[4832]: I1003 00:08:02.471153 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:02 crc kubenswrapper[4832]: I1003 00:08:02.471171 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:02 crc kubenswrapper[4832]: I1003 00:08:02.471186 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:02Z","lastTransitionTime":"2025-10-03T00:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:02 crc kubenswrapper[4832]: I1003 00:08:02.573466 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:02 crc kubenswrapper[4832]: I1003 00:08:02.573499 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:02 crc kubenswrapper[4832]: I1003 00:08:02.573510 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:02 crc kubenswrapper[4832]: I1003 00:08:02.573525 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:02 crc kubenswrapper[4832]: I1003 00:08:02.573537 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:02Z","lastTransitionTime":"2025-10-03T00:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:08:02 crc kubenswrapper[4832]: I1003 00:08:02.676530 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:02 crc kubenswrapper[4832]: I1003 00:08:02.676585 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:02 crc kubenswrapper[4832]: I1003 00:08:02.676599 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:02 crc kubenswrapper[4832]: I1003 00:08:02.676617 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:02 crc kubenswrapper[4832]: I1003 00:08:02.676631 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:02Z","lastTransitionTime":"2025-10-03T00:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:02 crc kubenswrapper[4832]: I1003 00:08:02.778808 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:02 crc kubenswrapper[4832]: I1003 00:08:02.778845 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:02 crc kubenswrapper[4832]: I1003 00:08:02.778855 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:02 crc kubenswrapper[4832]: I1003 00:08:02.778871 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:02 crc kubenswrapper[4832]: I1003 00:08:02.778880 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:02Z","lastTransitionTime":"2025-10-03T00:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:08:02 crc kubenswrapper[4832]: I1003 00:08:02.828651 4832 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zc9xl_0f189ec1-8169-4886-bff6-f67f7958050e/ovnkube-controller/2.log" Oct 03 00:08:02 crc kubenswrapper[4832]: I1003 00:08:02.831626 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" event={"ID":"0f189ec1-8169-4886-bff6-f67f7958050e","Type":"ContainerStarted","Data":"e1fbfcaa81a9cf32a47adcb39be9ce4e2d5f3b9cd8b8e8556cea49fd099ccef3"} Oct 03 00:08:02 crc kubenswrapper[4832]: I1003 00:08:02.832158 4832 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" Oct 03 00:08:02 crc kubenswrapper[4832]: I1003 00:08:02.847060 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ppthn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3e796ab-f354-47c2-8f1b-1d29fe9ab059\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:25Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ppthn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:08:02Z is after 2025-08-24T17:21:41Z" Oct 03 00:08:02 crc kubenswrapper[4832]: I1003 00:08:02.875914 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"31c3e0e9-c2a7-4c60-b523-9972df4adad5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1688f7425bdf5630147752e6d2c0dc58351b86d087c39bc6c4d73ffbf11949b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70f4161badca9a8087ab95b77ede5bc72d0c44b3f7f0223f2073354e76d6070d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5eb1fa15d84200378239e8a064cb83aa401750e92073f9d29d94858e65add776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://342b151ec083dc566428af857e4013b70d34fbf
70632147d13f5e251ef3460c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f39a6c9881b2f1d620a85a3c6887d82b81d18ecf599c66b967d5580689537314\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a180babeb6469b2158474f74047864c74117a5c272be9437712bc1729563108b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a180babeb6469b2158474f74047864c74117a5c272be9437712bc1729563108b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b12fcbe54af7073178320f8a130e78196b2d1d3a055c4d8410e4cc1da2d4686e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b12fcbe54af7073178320f8a130e78196b2d1d3a055c4d8410e4cc1da2d4686e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f5e7853d0c8e96873c687c75d7226d8c11ed0ab5f4f2e1e975423de06b7c8c10\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5e7853d0c8e96873c687c75d7226d8c11ed0ab5f4f2e1e975423de06b7c8c10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:06:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:08:02Z is after 2025-08-24T17:21:41Z" Oct 03 00:08:02 crc kubenswrapper[4832]: I1003 00:08:02.881058 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:02 crc kubenswrapper[4832]: I1003 00:08:02.881089 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:02 crc kubenswrapper[4832]: I1003 00:08:02.881096 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:02 crc kubenswrapper[4832]: I1003 00:08:02.881110 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:02 crc kubenswrapper[4832]: I1003 00:08:02.881118 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:02Z","lastTransitionTime":"2025-10-03T00:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:08:02 crc kubenswrapper[4832]: I1003 00:08:02.889027 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:08:02Z is after 2025-08-24T17:21:41Z" Oct 03 00:08:02 crc kubenswrapper[4832]: I1003 00:08:02.902420 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4175afbbdf7905f2c69f608e1cd85fdfae2818582d88e9a6d46c20ba0cede788\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:08:02Z is after 2025-08-24T17:21:41Z" Oct 03 00:08:02 crc kubenswrapper[4832]: I1003 00:08:02.913336 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6753f7eb0f3ff7e4c94125513be77026242079d658b7139e86701a2b04ba5d1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://823db4c8596fa85759a3d818b0282b8b8b2c4c2e948160accbd306ef09e38060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:08:02Z is after 2025-08-24T17:21:41Z" Oct 03 00:08:02 crc kubenswrapper[4832]: I1003 00:08:02.928102 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rwqt7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96bf1ee1-9249-4db9-83da-17eb15aa5f9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3584877a37d5f67dfaff9f091443ff884963e53e62bde6c8cefc55988d8c3a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://778c9afa5747e424ff72688b9d90dd21e3faec1f409c2b9683b5a16a37685369\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T00:07:58Z\\\",\\\"message\\\":\\\"2025-10-03T00:07:13+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_849ee2bf-aa0e-479e-b017-cbd23c73d446\\\\n2025-10-03T00:07:13+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_849ee2bf-aa0e-479e-b017-cbd23c73d446 to /host/opt/cni/bin/\\\\n2025-10-03T00:07:13Z [verbose] multus-daemon started\\\\n2025-10-03T00:07:13Z [verbose] Readiness Indicator file check\\\\n2025-10-03T00:07:58Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm92t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rwqt7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:08:02Z is after 2025-08-24T17:21:41Z" Oct 03 00:08:02 crc kubenswrapper[4832]: I1003 00:08:02.936784 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-76ck7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b962b7f-e442-4e25-97f4-21f16b9f1324\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e5df7a035e3c381c541f08eec6fffe846e7002a534b4c6ae1316a15cf74e469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ksm2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-76ck7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:08:02Z is after 2025-08-24T17:21:41Z" Oct 03 00:08:02 crc kubenswrapper[4832]: I1003 00:08:02.946843 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4j5bh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7262a85e-7e44-4422-b1d2-5d045503e06a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afd2b89ee54380c73b04bde347dce4d74a847e4ba94fc7aeabd5ac24e4dfd4b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vwhr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f03e95e621cb9e7d0c04b426440cf746eeccf68892215e1ed9e963935b9fdba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vwhr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4j5bh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:08:02Z is after 2025-08-24T17:21:41Z" Oct 03 
00:08:02 crc kubenswrapper[4832]: I1003 00:08:02.959100 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:08:02Z is after 2025-08-24T17:21:41Z" Oct 03 00:08:02 crc kubenswrapper[4832]: I1003 00:08:02.972816 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zdfsz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"66a6f849-6870-4e28-8ca4-c849c22ad252\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53e3e718893a18a23dee11b28bb7285f26501527c6aa136acf2b8f7e0d7494c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f49f638397d6dffb41b5774d362b7441d3191f290149de6ecef5c38a6e8a7a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f49f638397d6dffb41b5774d362b7441d3191f290149de6ecef5c38a6e8a7a81\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d52a70ad1026d8eaaeed9dbb581eb90dbc962f036711e2f68085ac437c46b1fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d52a70ad1026d8eaaeed9dbb581eb90dbc962f036711e2f68085ac437c46b1fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ddd5a446bcc39d02bbaf8ae1b9a92b550b975c4280cea13a517076261e177bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ddd5a446bcc39d02bbaf8ae1b9a92b550b975c4280cea13a517076261e177bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f907c9873e2f8cbedebcc3116624dbaf2e6b8088d9f0ef52a5878e1515e73720\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f907c9873e2f8cbedebcc3116624dbaf2e6b8088d9f0ef52a5878e1515e73720\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d991977ea5d7140ddc09fecfc872067a05a7773c0066c94ec88c603b6f79f32e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d991977ea5d7140ddc09fecfc872067a05a7773c0066c94ec88c603b6f79f32e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7711e37ec666f0285f64eac11397ea7747d74997deaba1e9cc141218d28fbaa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7711e37ec666f0285f64eac11397ea7747d74997deaba1e9cc141218d28fbaa7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zdfsz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:08:02Z is after 2025-08-24T17:21:41Z" Oct 03 00:08:02 crc kubenswrapper[4832]: I1003 00:08:02.983380 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:02 crc kubenswrapper[4832]: I1003 00:08:02.983419 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:02 crc 
kubenswrapper[4832]: I1003 00:08:02.983431 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:02 crc kubenswrapper[4832]: I1003 00:08:02.983450 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:02 crc kubenswrapper[4832]: I1003 00:08:02.983463 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:02Z","lastTransitionTime":"2025-10-03T00:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:02 crc kubenswrapper[4832]: I1003 00:08:02.985409 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78a6b04d90ea1fd4356f0a54594a83fd6029be216c6263654ad4916f2c3b7f5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:08:02Z is after 2025-08-24T17:21:41Z" Oct 03 00:08:02 crc kubenswrapper[4832]: I1003 00:08:02.994136 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-b9c9g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4369beec-ad3f-4de9-909b-7d0f8c76aad3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://898c5e76b7f0bbad018b2aa9db1e27240ccbd9d5b66854f548aa6d664abc8799\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvpm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-b9c9g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:08:02Z is after 2025-08-24T17:21:41Z" Oct 03 00:08:03 crc kubenswrapper[4832]: I1003 00:08:03.011579 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f189ec1-8169-4886-bff6-f67f7958050e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ba40dbb3c6b603f4c75b3f017b0b8953a189aac6dba4cc0e1fa5174b6353f3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2c63d70335d75dec7e19ae160dfa2a00f834d9aa3725bb4b63b94aa488fca0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3b296d00954bf341fee4ad258c6ee44adbd2950c89e73aaf5141201b61c5332\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7712a571bceb2c88adb90f9f18cdef6f7666e96341c1f9c87e6a55c82b59efd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6ee507d1127a1ca82e96f57a031cfef9adbd0b310588ce49a32f0cbcab7b3f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e493b0aab45e7a7842b48f0a13a6850b3bcd8a8f2147880ff278b92b04e9535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1fbfcaa81a9cf32a47adcb39be9ce4e2d5f3b9cd8b8e8556cea49fd099ccef3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d07c7c33d5db217414d17af04896ed7415afa2e32a6d9686a330231dd0d0a0f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T00:07:36Z\\\",\\\"message\\\":\\\"-crc openshift-multus/multus-additional-cni-plugins-zdfsz openshift-image-registry/node-ca-76ck7 openshift-machine-config-operator/machine-config-daemon-xzpk4 openshift-network-diagnostics/network-check-source-55646444c4-trplf openshift-network-operator/network-operator-58b4c7f79c-55gtf openshift-network-operator/iptables-alerter-4ln5h]\\\\nI1003 00:07:36.207478 6499 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI1003 00:07:36.207478 6499 lb_config.go:1031] Cluster endpoints for openshift-network-console/networking-console-plugin for network=default are: map[]\\\\nF1003 00:07:36.207491 6499 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: 
certific\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:35Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae1e89f85d166170fd478f58e653b46eaa1c8335c1b124af9fde5b805864197b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\
\\"containerID\\\":\\\"cri-o://0669e6b46f57beece4c83940bcffe580a4b4e2db5451a9c9ca4a9ad486fc95b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0669e6b46f57beece4c83940bcffe580a4b4e2db5451a9c9ca4a9ad486fc95b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zc9xl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:08:03Z is after 2025-08-24T17:21:41Z" Oct 03 00:08:03 crc kubenswrapper[4832]: I1003 00:08:03.024203 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fad7ffe6-f721-4c38-8acd-2d4de59b527b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da0287ba03b221e49a8840f93c53df8a865b9216a4e93f4a0fc4c62d79227004\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edaa07dca4d8aba0a6439a43af7397b5d9ef4d5a398f5c03ac93f3f41fb80ed9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef77e4c71ea4d3dd7aa61afb0787bf88352730ae638cfefe1f9f9f033208692a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0636ac12ebbda7e4ee8bbe4f5be5454ca7a879c92c8f444466a34c387668398c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af8925d64d9d95117625cfca58e2f0c90128ce63c82370e3a505d48fce14e474\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"message\\\":\\\"file observer\\\\nW1003 00:07:06.003008 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 00:07:06.003232 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 00:07:06.007835 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-713505854/tls.crt::/tmp/serving-cert-713505854/tls.key\\\\\\\"\\\\nI1003 00:07:06.247728 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 00:07:06.253351 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 00:07:06.253377 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 00:07:06.253665 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 00:07:06.253677 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 00:07:06.261087 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 00:07:06.261596 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 00:07:06.261602 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 00:07:06.261606 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 00:07:06.261610 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 00:07:06.261613 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 00:07:06.261616 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 00:07:06.261905 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 00:07:06.264368 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc6fca5c4906594dc721c77dba62a306d840e88ed949a9a579a695bdc1f1c011\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://359b3f91c772c18a52ed1e5985275405b990187a860cf7a7f62c2f602f54c77e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://359b3f91c772c18a52ed1e5985275405b990187a860cf7a7f62c2f602f54c77e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:06:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:08:03Z is after 2025-08-24T17:21:41Z" Oct 03 00:08:03 crc kubenswrapper[4832]: I1003 00:08:03.037274 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a530701-412d-4b02-b532-de7993881992\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2b0553c8f774a096bc1446325436a1a73a24f67b3512462ce0b5029092f30c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0451961c80d1affd1cc6903d8c4ccba7de47c25255826d49ee312958f7a73b79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09e57b2ff533ebe739ef6e23dcd8d6f5ef85143046db77ed3a95ffb87f771152\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://703d4dc13279deda56e8154ab7238b507e78852fd477db2bbe6e3e19aa0fbd37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:06:47Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:08:03Z is after 2025-08-24T17:21:41Z" Oct 03 00:08:03 crc kubenswrapper[4832]: I1003 00:08:03.048328 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"75a3f03f-3483-4611-9098-1bc8f776bf72\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2179d1da37c6a7dc24550fceca8e5f629ced4a0bfc34d8e0439be45b1a1128b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a5815e928eca00e1782c57d1b18e6fb2f7db287f404ea3c83323efe2db13752\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4
.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c798af55ddd694a06f5faa8f350941ad89970fa629efb67ad4f0ed08608192c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://990d28547daf6496b88f4cbe2716fce600053a9cbff1effd1a7bcf555de9f990\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://990d28547daf6496b88f4cbe2716fce600053a9cbff1effd1a7bcf555de9f990\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:06:47Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:08:03Z is after 2025-08-24T17:21:41Z" Oct 03 00:08:03 crc kubenswrapper[4832]: I1003 00:08:03.059365 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with 
unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:08:03Z is after 2025-08-24T17:21:41Z" Oct 03 00:08:03 crc kubenswrapper[4832]: I1003 00:08:03.068942 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"13eeb165-19f7-4e3f-a442-1c9c570d853b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://196d3468b2a4ea6792c19f85b28df1b5df5e612428d986d2affed70158ba92f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfnk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://786805325c320e5c08542c14a9ac9790f7c2dec237737fe1aca1523d1f2d26a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfnk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xzpk4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:08:03Z is after 2025-08-24T17:21:41Z" Oct 03 00:08:03 crc kubenswrapper[4832]: I1003 00:08:03.085636 4832 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:03 crc kubenswrapper[4832]: I1003 00:08:03.085673 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:03 crc kubenswrapper[4832]: I1003 00:08:03.085680 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:03 crc kubenswrapper[4832]: I1003 00:08:03.085693 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:03 crc kubenswrapper[4832]: I1003 00:08:03.085702 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:03Z","lastTransitionTime":"2025-10-03T00:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:03 crc kubenswrapper[4832]: I1003 00:08:03.187797 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:03 crc kubenswrapper[4832]: I1003 00:08:03.187837 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:03 crc kubenswrapper[4832]: I1003 00:08:03.187846 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:03 crc kubenswrapper[4832]: I1003 00:08:03.187861 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:03 crc kubenswrapper[4832]: I1003 00:08:03.187870 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:03Z","lastTransitionTime":"2025-10-03T00:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:03 crc kubenswrapper[4832]: I1003 00:08:03.289639 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:03 crc kubenswrapper[4832]: I1003 00:08:03.289684 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:03 crc kubenswrapper[4832]: I1003 00:08:03.289695 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:03 crc kubenswrapper[4832]: I1003 00:08:03.289711 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:03 crc kubenswrapper[4832]: I1003 00:08:03.289724 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:03Z","lastTransitionTime":"2025-10-03T00:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:08:03 crc kubenswrapper[4832]: I1003 00:08:03.386105 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 00:08:03 crc kubenswrapper[4832]: I1003 00:08:03.386140 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ppthn" Oct 03 00:08:03 crc kubenswrapper[4832]: I1003 00:08:03.386152 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 00:08:03 crc kubenswrapper[4832]: E1003 00:08:03.386223 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 00:08:03 crc kubenswrapper[4832]: I1003 00:08:03.386369 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 00:08:03 crc kubenswrapper[4832]: E1003 00:08:03.386365 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ppthn" podUID="f3e796ab-f354-47c2-8f1b-1d29fe9ab059" Oct 03 00:08:03 crc kubenswrapper[4832]: E1003 00:08:03.386622 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 00:08:03 crc kubenswrapper[4832]: E1003 00:08:03.386717 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 00:08:03 crc kubenswrapper[4832]: I1003 00:08:03.391957 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:03 crc kubenswrapper[4832]: I1003 00:08:03.391995 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:03 crc kubenswrapper[4832]: I1003 00:08:03.392004 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:03 crc kubenswrapper[4832]: I1003 00:08:03.392016 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:03 crc kubenswrapper[4832]: I1003 00:08:03.392027 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:03Z","lastTransitionTime":"2025-10-03T00:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:03 crc kubenswrapper[4832]: I1003 00:08:03.495011 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:03 crc kubenswrapper[4832]: I1003 00:08:03.495043 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:03 crc kubenswrapper[4832]: I1003 00:08:03.495052 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:03 crc kubenswrapper[4832]: I1003 00:08:03.495064 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:03 crc kubenswrapper[4832]: I1003 00:08:03.495075 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:03Z","lastTransitionTime":"2025-10-03T00:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:08:03 crc kubenswrapper[4832]: I1003 00:08:03.597405 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:03 crc kubenswrapper[4832]: I1003 00:08:03.597440 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:03 crc kubenswrapper[4832]: I1003 00:08:03.597451 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:03 crc kubenswrapper[4832]: I1003 00:08:03.597467 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:03 crc kubenswrapper[4832]: I1003 00:08:03.597477 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:03Z","lastTransitionTime":"2025-10-03T00:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:03 crc kubenswrapper[4832]: I1003 00:08:03.699809 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:03 crc kubenswrapper[4832]: I1003 00:08:03.699854 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:03 crc kubenswrapper[4832]: I1003 00:08:03.699865 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:03 crc kubenswrapper[4832]: I1003 00:08:03.699882 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:03 crc kubenswrapper[4832]: I1003 00:08:03.699894 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:03Z","lastTransitionTime":"2025-10-03T00:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:03 crc kubenswrapper[4832]: I1003 00:08:03.802046 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:03 crc kubenswrapper[4832]: I1003 00:08:03.802074 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:03 crc kubenswrapper[4832]: I1003 00:08:03.802082 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:03 crc kubenswrapper[4832]: I1003 00:08:03.802095 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:03 crc kubenswrapper[4832]: I1003 00:08:03.802104 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:03Z","lastTransitionTime":"2025-10-03T00:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:08:03 crc kubenswrapper[4832]: I1003 00:08:03.835841 4832 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zc9xl_0f189ec1-8169-4886-bff6-f67f7958050e/ovnkube-controller/3.log" Oct 03 00:08:03 crc kubenswrapper[4832]: I1003 00:08:03.836937 4832 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zc9xl_0f189ec1-8169-4886-bff6-f67f7958050e/ovnkube-controller/2.log" Oct 03 00:08:03 crc kubenswrapper[4832]: I1003 00:08:03.839655 4832 generic.go:334] "Generic (PLEG): container finished" podID="0f189ec1-8169-4886-bff6-f67f7958050e" containerID="e1fbfcaa81a9cf32a47adcb39be9ce4e2d5f3b9cd8b8e8556cea49fd099ccef3" exitCode=1 Oct 03 00:08:03 crc kubenswrapper[4832]: I1003 00:08:03.839700 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" event={"ID":"0f189ec1-8169-4886-bff6-f67f7958050e","Type":"ContainerDied","Data":"e1fbfcaa81a9cf32a47adcb39be9ce4e2d5f3b9cd8b8e8556cea49fd099ccef3"} Oct 03 00:08:03 crc kubenswrapper[4832]: I1003 00:08:03.839748 4832 scope.go:117] "RemoveContainer" containerID="4d07c7c33d5db217414d17af04896ed7415afa2e32a6d9686a330231dd0d0a0f" Oct 03 00:08:03 crc kubenswrapper[4832]: I1003 00:08:03.840712 4832 scope.go:117] "RemoveContainer" containerID="e1fbfcaa81a9cf32a47adcb39be9ce4e2d5f3b9cd8b8e8556cea49fd099ccef3" Oct 03 00:08:03 crc kubenswrapper[4832]: E1003 00:08:03.840998 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-zc9xl_openshift-ovn-kubernetes(0f189ec1-8169-4886-bff6-f67f7958050e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" podUID="0f189ec1-8169-4886-bff6-f67f7958050e" Oct 03 00:08:03 crc kubenswrapper[4832]: I1003 00:08:03.862730 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6753f7eb0f3ff7e4c94125513be77026242079d658b7139e86701a2b04ba5d1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://823db4c8596fa85759a3d818b0282b8b8b2c4c2e948160accbd306ef09e38060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:08:03Z is after 2025-08-24T17:21:41Z" Oct 03 00:08:03 crc kubenswrapper[4832]: I1003 00:08:03.874716 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rwqt7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96bf1ee1-9249-4db9-83da-17eb15aa5f9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3584877a37d5f67dfaff9f091443ff884963e53e62bde6c8cefc55988d8c3a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://778c9afa5747e424ff72688b9d90dd21e3faec1f409c2b9683b5a16a37685369\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T00:07:58Z\\\",\\\"message\\\":\\\"2025-10-03T00:07:13+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_849ee2bf-aa0e-479e-b017-cbd23c73d446\\\\n2025-10-03T00:07:13+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_849ee2bf-aa0e-479e-b017-cbd23c73d446 to /host/opt/cni/bin/\\\\n2025-10-03T00:07:13Z [verbose] multus-daemon started\\\\n2025-10-03T00:07:13Z [verbose] Readiness Indicator file check\\\\n2025-10-03T00:07:58Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm92t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rwqt7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:08:03Z is after 2025-08-24T17:21:41Z" Oct 03 00:08:03 crc kubenswrapper[4832]: I1003 00:08:03.882638 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:03 crc kubenswrapper[4832]: I1003 00:08:03.882684 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:03 crc kubenswrapper[4832]: I1003 00:08:03.882702 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:03 crc kubenswrapper[4832]: I1003 00:08:03.882724 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:03 crc kubenswrapper[4832]: I1003 00:08:03.882740 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:03Z","lastTransitionTime":"2025-10-03T00:08:03Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:03 crc kubenswrapper[4832]: I1003 00:08:03.888941 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-76ck7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b962b7f-e442-4e25-97f4-21f16b9f1324\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e5df7a035e3c381c541f08eec6fffe846e7002a534b4c6ae1316a15cf74e469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ksm2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-76ck7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:08:03Z is after 2025-08-24T17:21:41Z" Oct 03 00:08:03 crc kubenswrapper[4832]: E1003 00:08:03.895187 4832 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T00:08:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T00:08:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T00:08:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T00:08:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T00:08:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T00:08:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T00:08:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T00:08:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae
669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1e5f816a-2677-4a9b-8b73-ae876a2ba5f2\\\",\\\"systemUUID\\\":\\\"22c336a9-c598-416a-a85e-c54be0a6a3c5\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:08:03Z is after 2025-08-24T17:21:41Z" Oct 03 00:08:03 crc kubenswrapper[4832]: I1003 00:08:03.899286 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:03 crc kubenswrapper[4832]: I1003 00:08:03.899323 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:03 crc kubenswrapper[4832]: I1003 00:08:03.899337 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:03 crc kubenswrapper[4832]: I1003 00:08:03.899357 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:03 crc kubenswrapper[4832]: I1003 00:08:03.899372 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:03Z","lastTransitionTime":"2025-10-03T00:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:08:03 crc kubenswrapper[4832]: I1003 00:08:03.902049 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4j5bh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7262a85e-7e44-4422-b1d2-5d045503e06a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afd2b89ee54380c73b04bde347dce4d74a847e4ba94fc7aeabd5ac24e4dfd4b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vwhr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f03e95e621cb9e7d0c04b426440cf746eeccf68892215e1ed9e963935b9fdba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vwhr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4j5bh\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:08:03Z is after 2025-08-24T17:21:41Z" Oct 03 00:08:03 crc kubenswrapper[4832]: I1003 00:08:03.914869 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ppthn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3e796ab-f354-47c2-8f1b-1d29fe9ab059\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:25Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ppthn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:08:03Z is after 2025-08-24T17:21:41Z" Oct 03 00:08:03 crc 
kubenswrapper[4832]: E1003 00:08:03.914976 4832 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T00:08:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T00:08:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T00:08:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T00:08:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T00:08:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T00:08:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T00:08:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T00:08:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider 
started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshif
t-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d
34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1e5f816a-2677-4a9b-8b73-ae876a2ba5f2\\\",\\\"systemUUID\\\":\\\"22c336a9-c598-416a-a85e-c54be0a6a3c5\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:08:03Z is after 2025-08-24T17:21:41Z" Oct 03 00:08:03 crc kubenswrapper[4832]: I1003 00:08:03.919584 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:03 crc kubenswrapper[4832]: I1003 00:08:03.919613 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:03 
crc kubenswrapper[4832]: I1003 00:08:03.919624 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:03 crc kubenswrapper[4832]: I1003 00:08:03.919642 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:03 crc kubenswrapper[4832]: I1003 00:08:03.919655 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:03Z","lastTransitionTime":"2025-10-03T00:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:03 crc kubenswrapper[4832]: E1003 00:08:03.933568 4832 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T00:08:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T00:08:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T00:08:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T00:08:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T00:08:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T00:08:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T00:08:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T00:08:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1e5f816a-2677-4a9b-8b73-ae876a2ba5f2\\\",\\\"systemUUID\\\":\\\"22c336a9-c598-416a-a85e-c54be0a6a3c5\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:08:03Z is after 2025-08-24T17:21:41Z" Oct 03 00:08:03 crc kubenswrapper[4832]: I1003 00:08:03.937233 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:03 crc kubenswrapper[4832]: I1003 00:08:03.937271 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 00:08:03 crc kubenswrapper[4832]: I1003 00:08:03.937280 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:03 crc kubenswrapper[4832]: I1003 00:08:03.937297 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:03 crc kubenswrapper[4832]: I1003 00:08:03.937306 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:03Z","lastTransitionTime":"2025-10-03T00:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:03 crc kubenswrapper[4832]: I1003 00:08:03.938273 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31c3e0e9-c2a7-4c60-b523-9972df4adad5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1688f7425bdf5630147752e6d2c0dc58351b86d087c39bc6c4d73ffbf11949b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70f4161badca9a8087ab95b77ede5bc72d0c44b3f7f0223f2073354e76d6070d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resourc
es\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5eb1fa15d84200378239e8a064cb83aa401750e92073f9d29d94858e65add776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://342b151ec083dc566428af857e4013b70d34fbf70632147d13f5e251ef3460c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f39a6c9881b2f1d620a85a3c6887d82b81d18ecf599c66b967d5580689537314\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a180babeb6469b2158474f74047864c74117a5c272be9437712bc1729563108b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a180babeb6469b2158474f74047864c74117a5c272be9437712bc1729563108b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:48Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b12fcbe54af7073178320f8a130e78196b2d1d3a055c4d8410e4cc1da2d4686e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b12fcbe54af7073178320f8a130e78196b2d1d3a055c4d8410e4cc1da2d4686e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f5e7853d0c8e96873c687c75d7226d8c11ed0ab5f4f2e1e975423de06b7c8c10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5e7853d0c8e96873c687c75d7226d8c11ed0ab5f4f2e1e975423de06b7c8c10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:06:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:08:03Z is after 2025-08-24T17:21:41Z" Oct 03 00:08:03 crc kubenswrapper[4832]: I1003 00:08:03.949419 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:08:03Z is after 2025-08-24T17:21:41Z" Oct 03 00:08:03 crc kubenswrapper[4832]: E1003 00:08:03.951171 4832 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T00:08:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T00:08:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T00:08:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T00:08:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T00:08:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T00:08:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T00:08:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T00:08:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1e5f816a-2677-4a9b-8b73-ae876a2ba5f2\\\",\\\"systemUUID\\\":\\\"22c336a9-c598-416a-a85e-c54be0a6a3c5\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:08:03Z is after 2025-08-24T17:21:41Z" Oct 03 00:08:03 crc kubenswrapper[4832]: I1003 00:08:03.954091 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:03 crc kubenswrapper[4832]: I1003 00:08:03.954112 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 00:08:03 crc kubenswrapper[4832]: I1003 00:08:03.954142 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:03 crc kubenswrapper[4832]: I1003 00:08:03.954156 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:03 crc kubenswrapper[4832]: I1003 00:08:03.954164 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:03Z","lastTransitionTime":"2025-10-03T00:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:03 crc kubenswrapper[4832]: I1003 00:08:03.963673 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4175afbbdf7905f2c69f608e1cd85fdfae2818582d88e9a6d46c20ba0cede788\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:08:03Z is after 2025-08-24T17:21:41Z" Oct 03 00:08:03 crc kubenswrapper[4832]: E1003 00:08:03.965806 4832 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T00:08:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T00:08:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T00:08:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T00:08:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T00:08:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T00:08:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T00:08:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T00:08:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae
669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1e5f816a-2677-4a9b-8b73-ae876a2ba5f2\\\",\\\"systemUUID\\\":\\\"22c336a9-c598-416a-a85e-c54be0a6a3c5\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:08:03Z is after 2025-08-24T17:21:41Z" Oct 03 00:08:03 crc kubenswrapper[4832]: E1003 00:08:03.965970 4832 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 03 00:08:03 crc kubenswrapper[4832]: I1003 00:08:03.972319 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:03 crc kubenswrapper[4832]: I1003 00:08:03.972379 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:03 crc kubenswrapper[4832]: I1003 00:08:03.972394 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:03 crc kubenswrapper[4832]: I1003 00:08:03.972414 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:03 crc kubenswrapper[4832]: I1003 00:08:03.972428 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:03Z","lastTransitionTime":"2025-10-03T00:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:08:03 crc kubenswrapper[4832]: I1003 00:08:03.979007 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:08:03Z is after 2025-08-24T17:21:41Z" Oct 03 00:08:03 crc kubenswrapper[4832]: I1003 00:08:03.993414 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zdfsz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"66a6f849-6870-4e28-8ca4-c849c22ad252\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53e3e718893a18a23dee11b28bb7285f26501527c6aa136acf2b8f7e0d7494c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f49f638397d6dffb41b5774d362b7441d3191f290149de6ecef5c38a6e8a7a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f49f638397d6dffb41b5774d362b7441d3191f290149de6ecef5c38a6e8a7a81\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d52a70ad1026d8eaaeed9dbb581eb90dbc962f036711e2f68085ac437c46b1fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d52a70ad1026d8eaaeed9dbb581eb90dbc962f036711e2f68085ac437c46b1fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ddd5a446bcc39d02bbaf8ae1b9a92b550b975c4280cea13a517076261e177bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ddd5a446bcc39d02bbaf8ae1b9a92b550b975c4280cea13a517076261e177bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f907c9873e2f8cbedebcc3116624dbaf2e6b8088d9f0ef52a5878e1515e73720\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f907c9873e2f8cbedebcc3116624dbaf2e6b8088d9f0ef52a5878e1515e73720\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d991977ea5d7140ddc09fecfc872067a05a7773c0066c94ec88c603b6f79f32e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d991977ea5d7140ddc09fecfc872067a05a7773c0066c94ec88c603b6f79f32e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7711e37ec666f0285f64eac11397ea7747d74997deaba1e9cc141218d28fbaa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7711e37ec666f0285f64eac11397ea7747d74997deaba1e9cc141218d28fbaa7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zdfsz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:08:03Z is after 2025-08-24T17:21:41Z" Oct 03 00:08:04 crc kubenswrapper[4832]: I1003 00:08:04.004813 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78a6b04d90ea1fd4356f0a54594a83fd6029be216c6263654ad4916f2c3b7f5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:08:04Z is after 2025-08-24T17:21:41Z" Oct 03 00:08:04 crc kubenswrapper[4832]: I1003 00:08:04.014264 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-b9c9g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4369beec-ad3f-4de9-909b-7d0f8c76aad3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://898c5e76b7f0bbad018b2aa9db1e27240ccbd9d5b66854f548aa6d664abc8799\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvpm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-b9c9g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:08:04Z is after 2025-08-24T17:21:41Z" Oct 03 00:08:04 crc kubenswrapper[4832]: I1003 00:08:04.030962 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f189ec1-8169-4886-bff6-f67f7958050e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ba40dbb3c6b603f4c75b3f017b0b8953a189aac6dba4cc0e1fa5174b6353f3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2c63d70335d75dec7e19ae160dfa2a00f834d9aa3725bb4b63b94aa488fca0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3b296d00954bf341fee4ad258c6ee44adbd2950c89e73aaf5141201b61c5332\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7712a571bceb2c88adb90f9f18cdef6f7666e96341c1f9c87e6a55c82b59efd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6ee507d1127a1ca82e96f57a031cfef9adbd0b310588ce49a32f0cbcab7b3f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e493b0aab45e7a7842b48f0a13a6850b3bcd8a8f2147880ff278b92b04e9535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1fbfcaa81a9cf32a47adcb39be9ce4e2d5f3b9cd8b8e8556cea49fd099ccef3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d07c7c33d5db217414d17af04896ed7415afa2e32a6d9686a330231dd0d0a0f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T00:07:36Z\\\",\\\"message\\\":\\\"-crc openshift-multus/multus-additional-cni-plugins-zdfsz openshift-image-registry/node-ca-76ck7 openshift-machine-config-operator/machine-config-daemon-xzpk4 openshift-network-diagnostics/network-check-source-55646444c4-trplf openshift-network-operator/network-operator-58b4c7f79c-55gtf openshift-network-operator/iptables-alerter-4ln5h]\\\\nI1003 00:07:36.207478 6499 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI1003 00:07:36.207478 6499 lb_config.go:1031] Cluster endpoints for openshift-network-console/networking-console-plugin for network=default are: map[]\\\\nF1003 00:07:36.207491 6499 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certific\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:35Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1fbfcaa81a9cf32a47adcb39be9ce4e2d5f3b9cd8b8e8556cea49fd099ccef3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T00:08:03Z\\\",\\\"message\\\":\\\"image-registry/node-ca-76ck7 openshift-kube-scheduler/openshift-kube-scheduler-crc openshift-machine-config-operator/machine-config-daemon-xzpk4 openshift-multus/multus-additional-cni-plugins-zdfsz openshift-network-operator/iptables-alerter-4ln5h openshift-etcd/etcd-crc openshift-network-diagnostics/network-check-target-xd92c openshift-kube-apiserver/kube-apiserver-crc openshift-multus/multus-rwqt7 openshift-multus/network-metrics-daemon-ppthn openshift-network-diagnostics/network-check-source-55646444c4-trplf openshift-ovn-kubernetes/ovnkube-node-zc9xl]\\\\nI1003 00:08:03.233688 6867 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI1003 00:08:03.233703 6867 obj_retry.go:303] Retry object setup: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-zc9xl\\\\nF1003 00:08:03.233707 6867 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network 
controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T00:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae1e89f85d166170fd478f58e653b46eaa1c8335c1b124af9fde5b805864197b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\
\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0669e6b46f57beece4c83940bcffe580a4b4e2db5451a9c9ca4a9ad486fc95b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0669e6b46f57beece4c83940bcffe580a4b4e2db5451a9c9ca4a9ad486fc95b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zc9xl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:08:04Z is after 2025-08-24T17:21:41Z" Oct 03 00:08:04 crc kubenswrapper[4832]: I1003 00:08:04.042208 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"75a3f03f-3483-4611-9098-1bc8f776bf72\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2179d1da37c6a7dc24550fceca8e5f629ced4a0bfc34d8e0439be45b1a1128b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a5815e928eca00e1782c57d1b18e6fb2f7db287f404ea3c83323efe2db13752\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c798af55ddd694a06f5faa8f350941ad89970fa629efb67ad4f0ed08608192c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://990d28547daf6496b88f4cbe2716fce600053a9cbff1effd1a7bcf555de9f990\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://990d28547daf6496b88f4cbe2716fce600053a9cbff1effd1a7bcf555de9f990\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:06:47Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:08:04Z is after 2025-08-24T17:21:41Z" Oct 03 00:08:04 crc kubenswrapper[4832]: I1003 00:08:04.052658 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:08:04Z is after 2025-08-24T17:21:41Z" Oct 03 00:08:04 crc kubenswrapper[4832]: I1003 00:08:04.062916 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13eeb165-19f7-4e3f-a442-1c9c570d853b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://196d3468b2a4ea6792c19f85b28df1b5df5e612428d986d2affed70158ba92f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfnk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://786805325c320e5c08542c14a9ac9790f7c2dec237737fe1aca1523d1f2d26a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfnk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xzpk4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:08:04Z is after 2025-08-24T17:21:41Z" Oct 03 00:08:04 crc kubenswrapper[4832]: I1003 00:08:04.074266 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:04 crc kubenswrapper[4832]: I1003 00:08:04.074307 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:04 crc kubenswrapper[4832]: I1003 00:08:04.074318 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:04 crc kubenswrapper[4832]: I1003 00:08:04.074334 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:04 crc kubenswrapper[4832]: I1003 00:08:04.074345 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:04Z","lastTransitionTime":"2025-10-03T00:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:08:04 crc kubenswrapper[4832]: I1003 00:08:04.074849 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fad7ffe6-f721-4c38-8acd-2d4de59b527b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da0287ba03b221e49a8840f93c53df8a865b9216a4e93f4a0fc4c62d79227004\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edaa07dca4d8aba0a6439a43af7397b5d9ef4d5a398f5c03ac93f3f41fb80ed9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef77e4c71ea4d3dd7aa61afb0787bf88352730ae638cfefe1f9f9f033208692a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0636ac12ebbda7e4ee8bbe4f5be5454ca7a879c92c8f444466a34c387668398c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af8925d64d9d95117625cfca58e2f0c90128ce63c82370e3a505d48fce14e474\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"message\\\":\\\"file observer\\\\nW1003 00:07:06.003008 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 00:07:06.003232 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 00:07:06.007835 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-713505854/tls.crt::/tmp/serving-cert-713505854/tls.key\\\\\\\"\\\\nI1003 00:07:06.247728 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 00:07:06.253351 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 00:07:06.253377 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 00:07:06.253665 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 00:07:06.253677 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 00:07:06.261087 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 00:07:06.261596 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 00:07:06.261602 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 00:07:06.261606 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 00:07:06.261610 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 00:07:06.261613 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 00:07:06.261616 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 00:07:06.261905 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 00:07:06.264368 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc6fca5c4906594dc721c77dba62a306d840e88ed949a9a579a695bdc1f1c011\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://359b3f91c772c18a52ed1e5985275405b990187a860cf7a7f62c2f602f54c77e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://359b3f91c772c18a52ed1e5985275405b990187a860cf7a7f62c2f602f54c77e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:06:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:08:04Z is after 2025-08-24T17:21:41Z" Oct 03 00:08:04 crc kubenswrapper[4832]: I1003 00:08:04.085565 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a530701-412d-4b02-b532-de7993881992\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2b0553c8f774a096bc1446325436a1a73a24f67b3512462ce0b5029092f30c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0451961c80d1affd1cc6903d8c4ccba7de47c25255826d49ee312958f7a73b79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09e57b2ff533ebe739ef6e23dcd8d6f5ef85143046db77ed3a95ffb87f771152\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://703d4dc13279deda56e8154ab7238b507e78852fd477db2bbe6e3e19aa0fbd37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:06:47Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:08:04Z is after 2025-08-24T17:21:41Z" Oct 03 00:08:04 crc kubenswrapper[4832]: I1003 00:08:04.176301 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:04 crc kubenswrapper[4832]: I1003 00:08:04.176374 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:04 crc kubenswrapper[4832]: I1003 00:08:04.176383 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:04 crc kubenswrapper[4832]: I1003 00:08:04.176418 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:04 crc kubenswrapper[4832]: I1003 00:08:04.176428 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:04Z","lastTransitionTime":"2025-10-03T00:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:08:04 crc kubenswrapper[4832]: I1003 00:08:04.278441 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:04 crc kubenswrapper[4832]: I1003 00:08:04.278499 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:04 crc kubenswrapper[4832]: I1003 00:08:04.278513 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:04 crc kubenswrapper[4832]: I1003 00:08:04.278533 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:04 crc kubenswrapper[4832]: I1003 00:08:04.278547 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:04Z","lastTransitionTime":"2025-10-03T00:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:04 crc kubenswrapper[4832]: I1003 00:08:04.380809 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:04 crc kubenswrapper[4832]: I1003 00:08:04.380847 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:04 crc kubenswrapper[4832]: I1003 00:08:04.380860 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:04 crc kubenswrapper[4832]: I1003 00:08:04.380876 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:04 crc kubenswrapper[4832]: I1003 00:08:04.380887 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:04Z","lastTransitionTime":"2025-10-03T00:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:04 crc kubenswrapper[4832]: I1003 00:08:04.483281 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:04 crc kubenswrapper[4832]: I1003 00:08:04.483327 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:04 crc kubenswrapper[4832]: I1003 00:08:04.483342 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:04 crc kubenswrapper[4832]: I1003 00:08:04.483359 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:04 crc kubenswrapper[4832]: I1003 00:08:04.483375 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:04Z","lastTransitionTime":"2025-10-03T00:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:08:04 crc kubenswrapper[4832]: I1003 00:08:04.586004 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:04 crc kubenswrapper[4832]: I1003 00:08:04.586038 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:04 crc kubenswrapper[4832]: I1003 00:08:04.586048 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:04 crc kubenswrapper[4832]: I1003 00:08:04.586062 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:04 crc kubenswrapper[4832]: I1003 00:08:04.586071 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:04Z","lastTransitionTime":"2025-10-03T00:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:04 crc kubenswrapper[4832]: I1003 00:08:04.688429 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:04 crc kubenswrapper[4832]: I1003 00:08:04.688465 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:04 crc kubenswrapper[4832]: I1003 00:08:04.688473 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:04 crc kubenswrapper[4832]: I1003 00:08:04.688488 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:04 crc kubenswrapper[4832]: I1003 00:08:04.688497 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:04Z","lastTransitionTime":"2025-10-03T00:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:04 crc kubenswrapper[4832]: I1003 00:08:04.790603 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:04 crc kubenswrapper[4832]: I1003 00:08:04.790646 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:04 crc kubenswrapper[4832]: I1003 00:08:04.790657 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:04 crc kubenswrapper[4832]: I1003 00:08:04.790675 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:04 crc kubenswrapper[4832]: I1003 00:08:04.790687 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:04Z","lastTransitionTime":"2025-10-03T00:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:08:04 crc kubenswrapper[4832]: I1003 00:08:04.843916 4832 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zc9xl_0f189ec1-8169-4886-bff6-f67f7958050e/ovnkube-controller/3.log" Oct 03 00:08:04 crc kubenswrapper[4832]: I1003 00:08:04.846902 4832 scope.go:117] "RemoveContainer" containerID="e1fbfcaa81a9cf32a47adcb39be9ce4e2d5f3b9cd8b8e8556cea49fd099ccef3" Oct 03 00:08:04 crc kubenswrapper[4832]: E1003 00:08:04.847114 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-zc9xl_openshift-ovn-kubernetes(0f189ec1-8169-4886-bff6-f67f7958050e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" podUID="0f189ec1-8169-4886-bff6-f67f7958050e" Oct 03 00:08:04 crc kubenswrapper[4832]: I1003 00:08:04.858335 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:08:04Z is after 2025-08-24T17:21:41Z" Oct 03 00:08:04 crc kubenswrapper[4832]: I1003 00:08:04.870658 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zdfsz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66a6f849-6870-4e28-8ca4-c849c22ad252\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53e3e718893a18a23dee11b28bb7285f26501527c6aa136acf2b8f7e0d7494c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f49f638397d6dffb41b5774d362b7441d3191f290149de6ecef5c38a6e8a7a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"s
tarted\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f49f638397d6dffb41b5774d362b7441d3191f290149de6ecef5c38a6e8a7a81\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d52a70ad1026d8eaaeed9dbb581eb90dbc962f036711e2f68085ac437c46b1fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d52a70ad1026d8eaaeed9dbb581eb90dbc962f036711e2f68085ac437c46b1fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ddd5a446bcc39d02bbaf8ae1b9a92b550b975c4280cea13a517076261e177bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ddd5a446bcc39d02bbaf8ae1b9a92b550b975c4280cea13a517076261e177bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"
}]},{\\\"containerID\\\":\\\"cri-o://f907c9873e2f8cbedebcc3116624dbaf2e6b8088d9f0ef52a5878e1515e73720\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f907c9873e2f8cbedebcc3116624dbaf2e6b8088d9f0ef52a5878e1515e73720\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d991977ea5d7140ddc09fecfc872067a05a7773c0066c94ec88c603b6f79f32e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d991977ea5d7140ddc09fecfc872067a05a7773c0066c94ec88c603b6f79f32e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7711e37ec666f0285f64eac11397ea7747d74997deaba1e9cc141218d28fbaa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7711e37ec666f0285f64eac11397ea7747d74997deaba1e9cc141218d28fbaa7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":
\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zdfsz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:08:04Z is after 2025-08-24T17:21:41Z" Oct 03 00:08:04 crc kubenswrapper[4832]: I1003 00:08:04.881894 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78a6b04d90ea1fd4356f0a54594a83fd6029be216c6263654ad4916f2c3b7f5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:08:04Z is after 2025-08-24T17:21:41Z" Oct 03 00:08:04 crc kubenswrapper[4832]: I1003 00:08:04.893566 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:04 crc kubenswrapper[4832]: I1003 00:08:04.893607 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:04 crc kubenswrapper[4832]: I1003 00:08:04.893637 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Oct 03 00:08:04 crc kubenswrapper[4832]: I1003 00:08:04.893691 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:04 crc kubenswrapper[4832]: I1003 00:08:04.893733 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:04Z","lastTransitionTime":"2025-10-03T00:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:04 crc kubenswrapper[4832]: I1003 00:08:04.898654 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-b9c9g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4369beec-ad3f-4de9-909b-7d0f8c76aad3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://898c5e76b7f0bbad018b2aa9db1e27240ccbd9d5b66854f548aa6d664abc8799\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvpm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-b9c9g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:08:04Z is after 2025-08-24T17:21:41Z" Oct 03 00:08:04 crc kubenswrapper[4832]: I1003 00:08:04.915111 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f189ec1-8169-4886-bff6-f67f7958050e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ba40dbb3c6b603f4c75b3f017b0b8953a189aac6dba4cc0e1fa5174b6353f3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2c63d70335d75dec7e19ae160dfa2a00f834d9aa3725bb4b63b94aa488fca0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3b296d00954bf341fee4ad258c6ee44adbd2950c89e73aaf5141201b61c5332\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7712a571bceb2c88adb90f9f18cdef6f7666e96341c1f9c87e6a55c82b59efd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6ee507d1127a1ca82e96f57a031cfef9adbd0b310588ce49a32f0cbcab7b3f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e493b0aab45e7a7842b48f0a13a6850b3bcd8a8f2147880ff278b92b04e9535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1fbfcaa81a9cf32a47adcb39be9ce4e2d5f3b9cd8b8e8556cea49fd099ccef3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1fbfcaa81a9cf32a47adcb39be9ce4e2d5f3b9cd8b8e8556cea49fd099ccef3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T00:08:03Z\\\",\\\"message\\\":\\\"image-registry/node-ca-76ck7 openshift-kube-scheduler/openshift-kube-scheduler-crc openshift-machine-config-operator/machine-config-daemon-xzpk4 openshift-multus/multus-additional-cni-plugins-zdfsz openshift-network-operator/iptables-alerter-4ln5h openshift-etcd/etcd-crc openshift-network-diagnostics/network-check-target-xd92c openshift-kube-apiserver/kube-apiserver-crc openshift-multus/multus-rwqt7 openshift-multus/network-metrics-daemon-ppthn openshift-network-diagnostics/network-check-source-55646444c4-trplf openshift-ovn-kubernetes/ovnkube-node-zc9xl]\\\\nI1003 00:08:03.233688 6867 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI1003 00:08:03.233703 6867 obj_retry.go:303] Retry object setup: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-zc9xl\\\\nF1003 00:08:03.233707 6867 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T00:08:02Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-zc9xl_openshift-ovn-kubernetes(0f189ec1-8169-4886-bff6-f67f7958050e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae1e89f85d166170fd478f58e653b46eaa1c8335c1b124af9fde5b805864197b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0669e6b46f57beece4c83940bcffe580a4b4e2db5451a9c9ca4a9ad486fc95b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0669e6b46f57beece4c83940bcffe580a4b4e2db5451a9c9ca4a9ad486fc95b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zc9xl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:08:04Z is after 2025-08-24T17:21:41Z" Oct 03 00:08:04 crc kubenswrapper[4832]: I1003 00:08:04.927177 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:08:04Z is after 2025-08-24T17:21:41Z" Oct 03 00:08:04 crc kubenswrapper[4832]: I1003 00:08:04.938446 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13eeb165-19f7-4e3f-a442-1c9c570d853b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://196d3468b2a4ea6792c19f85b28df1b5df5e612428d986d2affed70158ba92f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfnk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://786805325c320e5c08542c14a9ac9790f7c2dec237737fe1aca1523d1f2d26a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfnk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xzpk4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:08:04Z is after 2025-08-24T17:21:41Z" Oct 03 00:08:04 crc kubenswrapper[4832]: I1003 00:08:04.952637 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fad7ffe6-f721-4c38-8acd-2d4de59b527b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da0287ba03b221e49a8840f93c53df8a865b9216a4e93f4a0fc4c62d79227004\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edaa07dca4d8aba0a6439a43af7397b5d9ef4d5a398f5c03ac93f3f41fb80ed9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6
355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef77e4c71ea4d3dd7aa61afb0787bf88352730ae638cfefe1f9f9f033208692a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0636ac12ebbda7e4ee8bbe4f5be5454ca7a879c92c8f444466a34c387668398c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af8925d64d9d95117625cfca58e2f0c90128ce63c82370e3a505d48fce14e474\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"message\\\":\\\"file observer\\\\nW1003 00:07:06.003008 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 00:07:06.003232 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 00:07:06.007835 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-713505854/tls.crt::/tmp/serving-cert-713505854/tls.key\\\\\\\"\\\\nI1003 00:07:06.247728 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 00:07:06.253351 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 00:07:06.253377 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 00:07:06.253665 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 00:07:06.253677 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 00:07:06.261087 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 00:07:06.261596 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 00:07:06.261602 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 00:07:06.261606 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 00:07:06.261610 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 00:07:06.261613 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 00:07:06.261616 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 00:07:06.261905 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 00:07:06.264368 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc6fca5c4906594dc721c77dba62a306d840e88ed949a9a579a695bdc1f1c011\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://359b3f91c772c18a52ed1e5985275405b990187a860cf7a7f62c2f602f54c77e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://359b3f91c772c18a52ed1e5985275405b990187a860cf7a7f62c2f602f54c77e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:06:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:08:04Z is after 2025-08-24T17:21:41Z" Oct 03 00:08:04 crc kubenswrapper[4832]: I1003 00:08:04.963457 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a530701-412d-4b02-b532-de7993881992\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2b0553c8f774a096bc1446325436a1a73a24f67b3512462ce0b5029092f30c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0451961c80d1affd1cc6903d8c4ccba7de47c25255826d49ee312958f7a73b79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09e57b2ff533ebe739ef6e23dcd8d6f5ef85143046db77ed3a95ffb87f771152\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://703d4dc13279deda56e8154ab7238b507e78852fd477db2bbe6e3e19aa0fbd37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:06:47Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:08:04Z is after 2025-08-24T17:21:41Z" Oct 03 00:08:04 crc kubenswrapper[4832]: I1003 00:08:04.973873 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"75a3f03f-3483-4611-9098-1bc8f776bf72\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2179d1da37c6a7dc24550fceca8e5f629ced4a0bfc34d8e0439be45b1a1128b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a5815e928eca00e1782c57d1b18e6fb2f7db287f404ea3c83323efe2db13752\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4
.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c798af55ddd694a06f5faa8f350941ad89970fa629efb67ad4f0ed08608192c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://990d28547daf6496b88f4cbe2716fce600053a9cbff1effd1a7bcf555de9f990\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://990d28547daf6496b88f4cbe2716fce600053a9cbff1effd1a7bcf555de9f990\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:06:47Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:08:04Z is after 2025-08-24T17:21:41Z" Oct 03 00:08:04 crc kubenswrapper[4832]: I1003 00:08:04.986200 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rwqt7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96bf1ee1-9249-4db9-83da-17eb15aa5f9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3584877a37d5f67dfaff9f091443ff884963e53e62bde6c8cefc55988d8c3a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://778c9afa5747e424ff72688b9d90dd21e3faec1f409c2b9683b5a16a37685369\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T00:07:58Z\\\",\\\"message\\\":\\\"2025-10-03T00:07:13+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_849ee2bf-aa0e-479e-b017-cbd23c73d446\\\\n2025-10-03T00:07:13+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_849ee2bf-aa0e-479e-b017-cbd23c73d446 to /host/opt/cni/bin/\\\\n2025-10-03T00:07:13Z [verbose] multus-daemon started\\\\n2025-10-03T00:07:13Z [verbose] Readiness Indicator file check\\\\n2025-10-03T00:07:58Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm92t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rwqt7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:08:04Z is after 2025-08-24T17:21:41Z" Oct 03 00:08:04 crc kubenswrapper[4832]: I1003 00:08:04.995472 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:04 crc kubenswrapper[4832]: I1003 00:08:04.995531 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:04 crc kubenswrapper[4832]: I1003 00:08:04.995541 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:04 crc kubenswrapper[4832]: I1003 00:08:04.995554 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:04 crc kubenswrapper[4832]: I1003 00:08:04.995581 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:04Z","lastTransitionTime":"2025-10-03T00:08:04Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:04 crc kubenswrapper[4832]: I1003 00:08:04.998622 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-76ck7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b962b7f-e442-4e25-97f4-21f16b9f1324\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e5df7a035e3c381c541f08eec6fffe846e7002a534b4c6ae1316a15cf74e469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ksm2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-76ck7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:08:04Z is after 2025-08-24T17:21:41Z" Oct 03 00:08:05 crc kubenswrapper[4832]: I1003 00:08:05.008338 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4j5bh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7262a85e-7e44-4422-b1d2-5d045503e06a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afd2b89ee54380c73b04bde347dce4d74a847e4ba94fc7aeabd5ac24e4dfd4b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vwhr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f03e95e621cb9e7d0c04b426440cf746eeccf68892215e1ed9e963935b9fdba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vwhr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4j5bh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:08:05Z is after 2025-08-24T17:21:41Z" Oct 03 
00:08:05 crc kubenswrapper[4832]: I1003 00:08:05.017265 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ppthn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3e796ab-f354-47c2-8f1b-1d29fe9ab059\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:25Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ppthn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:08:05Z is after 2025-08-24T17:21:41Z" Oct 03 00:08:05 crc kubenswrapper[4832]: I1003 00:08:05.033357 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"31c3e0e9-c2a7-4c60-b523-9972df4adad5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1688f7425bdf5630147752e6d2c0dc58351b86d087c39bc6c4d73ffbf11949b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70f4161badca9a8087ab95b77ede5bc72d0c44b3f7f0223f2073354e76d6070d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5eb1fa15d84200378239e8a064cb83aa401750e92073f9d29d94858e65add776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://342b151ec083dc566428af857e4013b70d34fbf
70632147d13f5e251ef3460c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f39a6c9881b2f1d620a85a3c6887d82b81d18ecf599c66b967d5580689537314\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a180babeb6469b2158474f74047864c74117a5c272be9437712bc1729563108b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a180babeb6469b2158474f74047864c74117a5c272be9437712bc1729563108b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b12fcbe54af7073178320f8a130e78196b2d1d3a055c4d8410e4cc1da2d4686e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b12fcbe54af7073178320f8a130e78196b2d1d3a055c4d8410e4cc1da2d4686e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f5e7853d0c8e96873c687c75d7226d8c11ed0ab5f4f2e1e975423de06b7c8c10\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5e7853d0c8e96873c687c75d7226d8c11ed0ab5f4f2e1e975423de06b7c8c10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:06:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:08:05Z is after 2025-08-24T17:21:41Z" Oct 03 00:08:05 crc kubenswrapper[4832]: I1003 00:08:05.043614 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:08:05Z is after 2025-08-24T17:21:41Z" Oct 03 00:08:05 crc kubenswrapper[4832]: I1003 00:08:05.054318 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4175afbbdf7905f2c69f608e1cd85fdfae2818582d88e9a6d46c20ba0cede788\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:08:05Z is after 2025-08-24T17:21:41Z" Oct 03 00:08:05 crc kubenswrapper[4832]: I1003 00:08:05.066548 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6753f7eb0f3ff7e4c94125513be77026242079d658b7139e86701a2b04ba5d1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://823db4c8596fa85759a3d818b0282b8b8b2c4c2e948160accbd306ef09e38060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:08:05Z is after 2025-08-24T17:21:41Z" Oct 03 00:08:05 crc kubenswrapper[4832]: I1003 00:08:05.097949 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:05 crc kubenswrapper[4832]: I1003 00:08:05.097977 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:05 crc kubenswrapper[4832]: I1003 00:08:05.097988 4832 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Oct 03 00:08:05 crc kubenswrapper[4832]: I1003 00:08:05.098005 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:05 crc kubenswrapper[4832]: I1003 00:08:05.098017 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:05Z","lastTransitionTime":"2025-10-03T00:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:05 crc kubenswrapper[4832]: I1003 00:08:05.200009 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:05 crc kubenswrapper[4832]: I1003 00:08:05.200052 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:05 crc kubenswrapper[4832]: I1003 00:08:05.200063 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:05 crc kubenswrapper[4832]: I1003 00:08:05.200078 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:05 crc kubenswrapper[4832]: I1003 00:08:05.200089 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:05Z","lastTransitionTime":"2025-10-03T00:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:05 crc kubenswrapper[4832]: I1003 00:08:05.302041 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:05 crc kubenswrapper[4832]: I1003 00:08:05.302294 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:05 crc kubenswrapper[4832]: I1003 00:08:05.302380 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:05 crc kubenswrapper[4832]: I1003 00:08:05.302456 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:05 crc kubenswrapper[4832]: I1003 00:08:05.302515 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:05Z","lastTransitionTime":"2025-10-03T00:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:05 crc kubenswrapper[4832]: I1003 00:08:05.385648 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 00:08:05 crc kubenswrapper[4832]: I1003 00:08:05.385698 4832 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 00:08:05 crc kubenswrapper[4832]: I1003 00:08:05.385698 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 00:08:05 crc kubenswrapper[4832]: I1003 00:08:05.385780 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ppthn" Oct 03 00:08:05 crc kubenswrapper[4832]: E1003 00:08:05.386207 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 00:08:05 crc kubenswrapper[4832]: E1003 00:08:05.386098 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 00:08:05 crc kubenswrapper[4832]: E1003 00:08:05.386377 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ppthn" podUID="f3e796ab-f354-47c2-8f1b-1d29fe9ab059" Oct 03 00:08:05 crc kubenswrapper[4832]: E1003 00:08:05.386542 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 00:08:05 crc kubenswrapper[4832]: I1003 00:08:05.404676 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:05 crc kubenswrapper[4832]: I1003 00:08:05.404722 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:05 crc kubenswrapper[4832]: I1003 00:08:05.404733 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:05 crc kubenswrapper[4832]: I1003 00:08:05.404746 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:05 crc kubenswrapper[4832]: I1003 00:08:05.404783 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:05Z","lastTransitionTime":"2025-10-03T00:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:08:05 crc kubenswrapper[4832]: I1003 00:08:05.507646 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:05 crc kubenswrapper[4832]: I1003 00:08:05.507689 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:05 crc kubenswrapper[4832]: I1003 00:08:05.507700 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:05 crc kubenswrapper[4832]: I1003 00:08:05.507719 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:05 crc kubenswrapper[4832]: I1003 00:08:05.507729 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:05Z","lastTransitionTime":"2025-10-03T00:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:05 crc kubenswrapper[4832]: I1003 00:08:05.611063 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:05 crc kubenswrapper[4832]: I1003 00:08:05.611103 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:05 crc kubenswrapper[4832]: I1003 00:08:05.611114 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:05 crc kubenswrapper[4832]: I1003 00:08:05.611128 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:05 crc kubenswrapper[4832]: I1003 00:08:05.611136 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:05Z","lastTransitionTime":"2025-10-03T00:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:05 crc kubenswrapper[4832]: I1003 00:08:05.713699 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:05 crc kubenswrapper[4832]: I1003 00:08:05.713980 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:05 crc kubenswrapper[4832]: I1003 00:08:05.714050 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:05 crc kubenswrapper[4832]: I1003 00:08:05.714122 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:05 crc kubenswrapper[4832]: I1003 00:08:05.714192 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:05Z","lastTransitionTime":"2025-10-03T00:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:08:05 crc kubenswrapper[4832]: I1003 00:08:05.816465 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:05 crc kubenswrapper[4832]: I1003 00:08:05.816711 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:05 crc kubenswrapper[4832]: I1003 00:08:05.816806 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:05 crc kubenswrapper[4832]: I1003 00:08:05.816877 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:05 crc kubenswrapper[4832]: I1003 00:08:05.816937 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:05Z","lastTransitionTime":"2025-10-03T00:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:05 crc kubenswrapper[4832]: I1003 00:08:05.919284 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:05 crc kubenswrapper[4832]: I1003 00:08:05.919321 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:05 crc kubenswrapper[4832]: I1003 00:08:05.919332 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:05 crc kubenswrapper[4832]: I1003 00:08:05.919350 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:05 crc kubenswrapper[4832]: I1003 00:08:05.919361 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:05Z","lastTransitionTime":"2025-10-03T00:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:06 crc kubenswrapper[4832]: I1003 00:08:06.022064 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:06 crc kubenswrapper[4832]: I1003 00:08:06.022099 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:06 crc kubenswrapper[4832]: I1003 00:08:06.022107 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:06 crc kubenswrapper[4832]: I1003 00:08:06.022120 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:06 crc kubenswrapper[4832]: I1003 00:08:06.022130 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:06Z","lastTransitionTime":"2025-10-03T00:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:08:06 crc kubenswrapper[4832]: I1003 00:08:06.125426 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:06 crc kubenswrapper[4832]: I1003 00:08:06.125470 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:06 crc kubenswrapper[4832]: I1003 00:08:06.125479 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:06 crc kubenswrapper[4832]: I1003 00:08:06.125494 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:06 crc kubenswrapper[4832]: I1003 00:08:06.125503 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:06Z","lastTransitionTime":"2025-10-03T00:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:06 crc kubenswrapper[4832]: I1003 00:08:06.228098 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:06 crc kubenswrapper[4832]: I1003 00:08:06.228133 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:06 crc kubenswrapper[4832]: I1003 00:08:06.228141 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:06 crc kubenswrapper[4832]: I1003 00:08:06.228154 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:06 crc kubenswrapper[4832]: I1003 00:08:06.228165 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:06Z","lastTransitionTime":"2025-10-03T00:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:06 crc kubenswrapper[4832]: I1003 00:08:06.329935 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:06 crc kubenswrapper[4832]: I1003 00:08:06.329977 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:06 crc kubenswrapper[4832]: I1003 00:08:06.329986 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:06 crc kubenswrapper[4832]: I1003 00:08:06.330000 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:06 crc kubenswrapper[4832]: I1003 00:08:06.330009 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:06Z","lastTransitionTime":"2025-10-03T00:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:08:06 crc kubenswrapper[4832]: I1003 00:08:06.432627 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:06 crc kubenswrapper[4832]: I1003 00:08:06.432653 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:06 crc kubenswrapper[4832]: I1003 00:08:06.432661 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:06 crc kubenswrapper[4832]: I1003 00:08:06.432673 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:06 crc kubenswrapper[4832]: I1003 00:08:06.432682 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:06Z","lastTransitionTime":"2025-10-03T00:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:06 crc kubenswrapper[4832]: I1003 00:08:06.534903 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:06 crc kubenswrapper[4832]: I1003 00:08:06.534930 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:06 crc kubenswrapper[4832]: I1003 00:08:06.534938 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:06 crc kubenswrapper[4832]: I1003 00:08:06.534949 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:06 crc kubenswrapper[4832]: I1003 00:08:06.534959 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:06Z","lastTransitionTime":"2025-10-03T00:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:06 crc kubenswrapper[4832]: I1003 00:08:06.637270 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:06 crc kubenswrapper[4832]: I1003 00:08:06.637299 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:06 crc kubenswrapper[4832]: I1003 00:08:06.637306 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:06 crc kubenswrapper[4832]: I1003 00:08:06.637319 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:06 crc kubenswrapper[4832]: I1003 00:08:06.637329 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:06Z","lastTransitionTime":"2025-10-03T00:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:08:06 crc kubenswrapper[4832]: I1003 00:08:06.739161 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:06 crc kubenswrapper[4832]: I1003 00:08:06.739204 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:06 crc kubenswrapper[4832]: I1003 00:08:06.739216 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:06 crc kubenswrapper[4832]: I1003 00:08:06.739231 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:06 crc kubenswrapper[4832]: I1003 00:08:06.739245 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:06Z","lastTransitionTime":"2025-10-03T00:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:06 crc kubenswrapper[4832]: I1003 00:08:06.841213 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:06 crc kubenswrapper[4832]: I1003 00:08:06.841273 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:06 crc kubenswrapper[4832]: I1003 00:08:06.841285 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:06 crc kubenswrapper[4832]: I1003 00:08:06.841302 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:06 crc kubenswrapper[4832]: I1003 00:08:06.841313 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:06Z","lastTransitionTime":"2025-10-03T00:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:06 crc kubenswrapper[4832]: I1003 00:08:06.943379 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:06 crc kubenswrapper[4832]: I1003 00:08:06.943434 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:06 crc kubenswrapper[4832]: I1003 00:08:06.943451 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:06 crc kubenswrapper[4832]: I1003 00:08:06.943477 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:06 crc kubenswrapper[4832]: I1003 00:08:06.943512 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:06Z","lastTransitionTime":"2025-10-03T00:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:08:07 crc kubenswrapper[4832]: I1003 00:08:07.045768 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:07 crc kubenswrapper[4832]: I1003 00:08:07.045819 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:07 crc kubenswrapper[4832]: I1003 00:08:07.045833 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:07 crc kubenswrapper[4832]: I1003 00:08:07.045851 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:07 crc kubenswrapper[4832]: I1003 00:08:07.045864 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:07Z","lastTransitionTime":"2025-10-03T00:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:07 crc kubenswrapper[4832]: I1003 00:08:07.148415 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:07 crc kubenswrapper[4832]: I1003 00:08:07.148459 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:07 crc kubenswrapper[4832]: I1003 00:08:07.148473 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:07 crc kubenswrapper[4832]: I1003 00:08:07.148511 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:07 crc kubenswrapper[4832]: I1003 00:08:07.148528 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:07Z","lastTransitionTime":"2025-10-03T00:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:07 crc kubenswrapper[4832]: I1003 00:08:07.250666 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:07 crc kubenswrapper[4832]: I1003 00:08:07.250707 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:07 crc kubenswrapper[4832]: I1003 00:08:07.250719 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:07 crc kubenswrapper[4832]: I1003 00:08:07.250736 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:07 crc kubenswrapper[4832]: I1003 00:08:07.250749 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:07Z","lastTransitionTime":"2025-10-03T00:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:08:07 crc kubenswrapper[4832]: I1003 00:08:07.353602 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:07 crc kubenswrapper[4832]: I1003 00:08:07.353642 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:07 crc kubenswrapper[4832]: I1003 00:08:07.353801 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:07 crc kubenswrapper[4832]: I1003 00:08:07.353835 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:07 crc kubenswrapper[4832]: I1003 00:08:07.353850 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:07Z","lastTransitionTime":"2025-10-03T00:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:07 crc kubenswrapper[4832]: I1003 00:08:07.386446 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ppthn" Oct 03 00:08:07 crc kubenswrapper[4832]: I1003 00:08:07.386488 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 00:08:07 crc kubenswrapper[4832]: E1003 00:08:07.386608 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ppthn" podUID="f3e796ab-f354-47c2-8f1b-1d29fe9ab059" Oct 03 00:08:07 crc kubenswrapper[4832]: I1003 00:08:07.386640 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 00:08:07 crc kubenswrapper[4832]: I1003 00:08:07.386670 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 00:08:07 crc kubenswrapper[4832]: E1003 00:08:07.386793 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 00:08:07 crc kubenswrapper[4832]: E1003 00:08:07.386870 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 00:08:07 crc kubenswrapper[4832]: E1003 00:08:07.386931 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 00:08:07 crc kubenswrapper[4832]: I1003 00:08:07.399857 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fad7ffe6-f721-4c38-8acd-2d4de59b527b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da0287ba03b221e49a8840f93c53df8a865b9216a4e93f4a0fc4c62d79227004\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edaa07dca4d8aba0a6439a43af7397b5d9ef4d5a398f5c03ac93f3f41fb80ed9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef77e4c71ea4d3dd7aa61afb0787bf88352730ae638cfefe1f9f9f033208692a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f89
45c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0636ac12ebbda7e4ee8bbe4f5be5454ca7a879c92c8f444466a34c387668398c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af8925d64d9d95117625cfca58e2f0c90128ce63c82370e3a505d48fce14e474\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"message\\\":\\\"file observer\\\\nW1003 00:07:06.003008 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 00:07:06.003232 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 00:07:06.007835 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-713505854/tls.crt::/tmp/serving-cert-713505854/tls.key\\\\\\\"\\\\nI1003 00:07:06.247728 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 00:07:06.253351 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 00:07:06.253377 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 00:07:06.253665 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 00:07:06.253677 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 00:07:06.261087 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 00:07:06.261596 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 00:07:06.261602 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 00:07:06.261606 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 00:07:06.261610 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 00:07:06.261613 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 00:07:06.261616 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 00:07:06.261905 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 00:07:06.264368 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc6fca5c4906594dc721c77dba62a306d840e88ed949a9a579a695bdc1f1c011\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://359b3f91c772c18a52ed1e5985275405b990187a860cf7a7f62c2f602f54c77e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://359b3f91c772c18a52ed1e5985275405b990187a860cf7a7f62c2f602f54c77e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:06:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:08:07Z is after 2025-08-24T17:21:41Z" Oct 03 00:08:07 crc kubenswrapper[4832]: I1003 00:08:07.411838 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a530701-412d-4b02-b532-de7993881992\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2b0553c8f774a096bc1446325436a1a73a24f67b3512462ce0b5029092f30c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0451961c80d1affd1cc6903d8c4ccba7de47c25255826d49ee312958f7a73b79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09e57b2ff533ebe739ef6e23dcd8d6f5ef85143046db77ed3a95ffb87f771152\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://703d4dc13279deda56e8154ab7238b507e78852fd477db2bbe6e3e19aa0fbd37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:06:47Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:08:07Z is after 2025-08-24T17:21:41Z" Oct 03 00:08:07 crc kubenswrapper[4832]: I1003 00:08:07.426847 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"75a3f03f-3483-4611-9098-1bc8f776bf72\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2179d1da37c6a7dc24550fceca8e5f629ced4a0bfc34d8e0439be45b1a1128b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a5815e928eca00e1782c57d1b18e6fb2f7db287f404ea3c83323efe2db13752\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4
.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c798af55ddd694a06f5faa8f350941ad89970fa629efb67ad4f0ed08608192c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://990d28547daf6496b88f4cbe2716fce600053a9cbff1effd1a7bcf555de9f990\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://990d28547daf6496b88f4cbe2716fce600053a9cbff1effd1a7bcf555de9f990\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:06:47Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:08:07Z is after 2025-08-24T17:21:41Z" Oct 03 00:08:07 crc kubenswrapper[4832]: I1003 00:08:07.438957 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with 
unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:08:07Z is after 2025-08-24T17:21:41Z" Oct 03 00:08:07 crc kubenswrapper[4832]: I1003 00:08:07.450427 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"13eeb165-19f7-4e3f-a442-1c9c570d853b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://196d3468b2a4ea6792c19f85b28df1b5df5e612428d986d2affed70158ba92f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfnk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://786805325c320e5c08542c14a9ac9790f7c2dec237737fe1aca1523d1f2d26a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfnk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xzpk4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:08:07Z is after 2025-08-24T17:21:41Z" Oct 03 00:08:07 crc kubenswrapper[4832]: I1003 00:08:07.457293 4832 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:07 crc kubenswrapper[4832]: I1003 00:08:07.457319 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:07 crc kubenswrapper[4832]: I1003 00:08:07.457335 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:07 crc kubenswrapper[4832]: I1003 00:08:07.457348 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:07 crc kubenswrapper[4832]: I1003 00:08:07.457356 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:07Z","lastTransitionTime":"2025-10-03T00:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:07 crc kubenswrapper[4832]: I1003 00:08:07.468286 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:08:07Z is after 2025-08-24T17:21:41Z" Oct 03 00:08:07 crc kubenswrapper[4832]: I1003 00:08:07.480919 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4175afbbdf7905f2c69f608e1cd85fdfae2818582d88e9a6d46c20ba0cede788\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:08:07Z is after 2025-08-24T17:21:41Z" Oct 03 00:08:07 crc kubenswrapper[4832]: I1003 00:08:07.492223 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6753f7eb0f3ff7e4c94125513be77026242079d658b7139e86701a2b04ba5d1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://823db4c8596fa85759a3d818b0282b8b8b2c4c2e948160accbd306ef09e38060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:08:07Z is after 2025-08-24T17:21:41Z" Oct 03 00:08:07 crc kubenswrapper[4832]: I1003 00:08:07.503862 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rwqt7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96bf1ee1-9249-4db9-83da-17eb15aa5f9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3584877a37d5f67dfaff9f091443ff884963e53e62bde6c8cefc55988d8c3a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://778c9afa5747e424ff72688b9d90dd21e3faec1f409c2b9683b5a16a37685369\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T00:07:58Z\\\",\\\"message\\\":\\\"2025-10-03T00:07:13+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_849ee2bf-aa0e-479e-b017-cbd23c73d446\\\\n2025-10-03T00:07:13+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_849ee2bf-aa0e-479e-b017-cbd23c73d446 to /host/opt/cni/bin/\\\\n2025-10-03T00:07:13Z [verbose] multus-daemon started\\\\n2025-10-03T00:07:13Z [verbose] Readiness Indicator file check\\\\n2025-10-03T00:07:58Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm92t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rwqt7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:08:07Z is after 2025-08-24T17:21:41Z" Oct 03 00:08:07 crc kubenswrapper[4832]: I1003 00:08:07.514580 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-76ck7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b962b7f-e442-4e25-97f4-21f16b9f1324\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e5df7a035e3c381c541f08eec6fffe846e7002a534b4c6ae1316a15cf74e469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ksm2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-76ck7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:08:07Z is after 2025-08-24T17:21:41Z" Oct 03 00:08:07 crc kubenswrapper[4832]: I1003 00:08:07.524228 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4j5bh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7262a85e-7e44-4422-b1d2-5d045503e06a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afd2b89ee54380c73b04bde347dce4d74a847e4ba94fc7aeabd5ac24e4dfd4b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vwhr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f03e95e621cb9e7d0c04b426440cf746eeccf68892215e1ed9e963935b9fdba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vwhr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4j5bh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:08:07Z is after 2025-08-24T17:21:41Z" Oct 03 
00:08:07 crc kubenswrapper[4832]: I1003 00:08:07.534493 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ppthn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3e796ab-f354-47c2-8f1b-1d29fe9ab059\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:25Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ppthn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:08:07Z is after 2025-08-24T17:21:41Z" Oct 03 00:08:07 crc kubenswrapper[4832]: I1003 00:08:07.559216 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:07 crc kubenswrapper[4832]: I1003 00:08:07.559246 4832 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:07 crc kubenswrapper[4832]: I1003 00:08:07.559254 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:07 crc kubenswrapper[4832]: I1003 00:08:07.559268 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:07 crc kubenswrapper[4832]: I1003 00:08:07.559276 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:07Z","lastTransitionTime":"2025-10-03T00:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:07 crc kubenswrapper[4832]: I1003 00:08:07.564255 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31c3e0e9-c2a7-4c60-b523-9972df4adad5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1688f7425bdf5630147752e6d2c0dc58351b86d087c39bc6c4d73ffbf11949b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70f4161badca9a8087ab95b77ede5bc72d0c44b3f7f0223f2073354e76d6070d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernet
es/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5eb1fa15d84200378239e8a064cb83aa401750e92073f9d29d94858e65add776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://342b151ec083dc566428af857e4013b70d34fbf70632147d13f5e251ef3460c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f39a6c9881b2f1d620a85a3c6887d82b81d18ecf599c66b967d5580689537314\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a180babeb6469b2158474f74047864c74117a5c272be9437712bc1729563108b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a180babeb6469b2158474f74047864c74117a5c272be9437712bc1729563108b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:48Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b12fcbe54af7073178320f8a130e78196b2d1d3a055c4d8410e4cc1da2d4686e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b12fcbe54af7073178320f8a130e78196b2d1d3a055c4d8410e4cc1da2d4686e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f5e7853d0c8e96873c687c75d7226d8c11ed0ab5f4f2e1e975423de06b7c8c10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5e7853d0c8e96873c687c75d7226d8c11ed0ab5f4f2e1e975423de06b7c8c10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:06:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:08:07Z is after 2025-08-24T17:21:41Z" Oct 03 00:08:07 crc kubenswrapper[4832]: I1003 00:08:07.589100 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zdfsz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"66a6f849-6870-4e28-8ca4-c849c22ad252\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53e3e718893a18a23dee11b28bb7285f26501527c6aa136acf2b8f7e0d7494c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f49f638397d6dffb41b5774d362b7441d3191f290149de6ecef5c38a6e8a7a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f49f638397d6dffb41b5774d362b7441d3191f290149de6ecef5c38a6e8a7a81\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d52a70ad1026d8eaaeed9dbb581eb90dbc962f036711e2f68085ac437c46b1fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d52a70ad1026d8eaaeed9dbb581eb90dbc962f036711e2f68085ac437c46b1fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ddd5a446bcc39d02bbaf8ae1b9a92b550b975c4280cea13a517076261e177bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ddd5a446bcc39d02bbaf8ae1b9a92b550b975c4280cea13a517076261e177bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f907c9873e2f8cbedebcc3116624dbaf2e6b8088d9f0ef52a5878e1515e73720\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f907c9873e2f8cbedebcc3116624dbaf2e6b8088d9f0ef52a5878e1515e73720\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d991977ea5d7140ddc09fecfc872067a05a7773c0066c94ec88c603b6f79f32e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d991977ea5d7140ddc09fecfc872067a05a7773c0066c94ec88c603b6f79f32e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7711e37ec666f0285f64eac11397ea7747d74997deaba1e9cc141218d28fbaa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7711e37ec666f0285f64eac11397ea7747d74997deaba1e9cc141218d28fbaa7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zdfsz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:08:07Z is after 2025-08-24T17:21:41Z" Oct 03 00:08:07 crc kubenswrapper[4832]: I1003 00:08:07.603222 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:08:07Z is after 2025-08-24T17:21:41Z" Oct 03 00:08:07 crc kubenswrapper[4832]: I1003 00:08:07.610580 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-b9c9g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4369beec-ad3f-4de9-909b-7d0f8c76aad3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://898c5e76b7f0bbad018b2aa9db1e27240ccbd9d5b66854f548aa6d664abc8799\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvpm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-b9c9g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:08:07Z is after 2025-08-24T17:21:41Z" Oct 03 00:08:07 crc kubenswrapper[4832]: I1003 00:08:07.626724 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f189ec1-8169-4886-bff6-f67f7958050e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ba40dbb3c6b603f4c75b3f017b0b8953a189aac6dba4cc0e1fa5174b6353f3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2c63d70335d75dec7e19ae160dfa2a00f834d9aa3725bb4b63b94aa488fca0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3b296d00954bf341fee4ad258c6ee44adbd2950c89e73aaf5141201b61c5332\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7712a571bceb2c88adb90f9f18cdef6f7666e96341c1f9c87e6a55c82b59efd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6ee507d1127a1ca82e96f57a031cfef9adbd0b310588ce49a32f0cbcab7b3f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e493b0aab45e7a7842b48f0a13a6850b3bcd8a8f2147880ff278b92b04e9535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1fbfcaa81a9cf32a47adcb39be9ce4e2d5f3b9cd8b8e8556cea49fd099ccef3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1fbfcaa81a9cf32a47adcb39be9ce4e2d5f3b9cd8b8e8556cea49fd099ccef3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T00:08:03Z\\\",\\\"message\\\":\\\"image-registry/node-ca-76ck7 openshift-kube-scheduler/openshift-kube-scheduler-crc openshift-machine-config-operator/machine-config-daemon-xzpk4 openshift-multus/multus-additional-cni-plugins-zdfsz openshift-network-operator/iptables-alerter-4ln5h openshift-etcd/etcd-crc openshift-network-diagnostics/network-check-target-xd92c openshift-kube-apiserver/kube-apiserver-crc openshift-multus/multus-rwqt7 openshift-multus/network-metrics-daemon-ppthn openshift-network-diagnostics/network-check-source-55646444c4-trplf openshift-ovn-kubernetes/ovnkube-node-zc9xl]\\\\nI1003 00:08:03.233688 6867 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI1003 00:08:03.233703 6867 obj_retry.go:303] Retry object setup: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-zc9xl\\\\nF1003 00:08:03.233707 6867 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T00:08:02Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-zc9xl_openshift-ovn-kubernetes(0f189ec1-8169-4886-bff6-f67f7958050e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae1e89f85d166170fd478f58e653b46eaa1c8335c1b124af9fde5b805864197b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0669e6b46f57beece4c83940bcffe580a4b4e2db5451a9c9ca4a9ad486fc95b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0669e6b46f57beece4c83940bcffe580a4b4e2db5451a9c9ca4a9ad486fc95b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zc9xl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:08:07Z is after 2025-08-24T17:21:41Z" Oct 03 00:08:07 crc kubenswrapper[4832]: I1003 00:08:07.639360 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78a6b04d90ea1fd4356f0a54594a83fd6029be216c6263654ad4916f2c3b7f5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed 
to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:08:07Z is after 2025-08-24T17:21:41Z" Oct 03 00:08:07 crc kubenswrapper[4832]: I1003 00:08:07.660949 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:07 crc kubenswrapper[4832]: I1003 00:08:07.660973 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:07 crc kubenswrapper[4832]: I1003 00:08:07.660981 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:07 crc kubenswrapper[4832]: I1003 00:08:07.660993 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:07 crc kubenswrapper[4832]: I1003 00:08:07.661001 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:07Z","lastTransitionTime":"2025-10-03T00:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:07 crc kubenswrapper[4832]: I1003 00:08:07.762889 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:07 crc kubenswrapper[4832]: I1003 00:08:07.763233 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:07 crc kubenswrapper[4832]: I1003 00:08:07.763245 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:07 crc kubenswrapper[4832]: I1003 00:08:07.763263 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:07 crc kubenswrapper[4832]: I1003 00:08:07.763276 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:07Z","lastTransitionTime":"2025-10-03T00:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:08:07 crc kubenswrapper[4832]: I1003 00:08:07.865521 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:07 crc kubenswrapper[4832]: I1003 00:08:07.865637 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:07 crc kubenswrapper[4832]: I1003 00:08:07.865657 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:07 crc kubenswrapper[4832]: I1003 00:08:07.865678 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:07 crc kubenswrapper[4832]: I1003 00:08:07.865694 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:07Z","lastTransitionTime":"2025-10-03T00:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:07 crc kubenswrapper[4832]: I1003 00:08:07.967652 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:07 crc kubenswrapper[4832]: I1003 00:08:07.967689 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:07 crc kubenswrapper[4832]: I1003 00:08:07.967704 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:07 crc kubenswrapper[4832]: I1003 00:08:07.967721 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:07 crc kubenswrapper[4832]: I1003 00:08:07.967733 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:07Z","lastTransitionTime":"2025-10-03T00:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:08 crc kubenswrapper[4832]: I1003 00:08:08.069789 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:08 crc kubenswrapper[4832]: I1003 00:08:08.069838 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:08 crc kubenswrapper[4832]: I1003 00:08:08.069852 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:08 crc kubenswrapper[4832]: I1003 00:08:08.069870 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:08 crc kubenswrapper[4832]: I1003 00:08:08.069881 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:08Z","lastTransitionTime":"2025-10-03T00:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:08:08 crc kubenswrapper[4832]: I1003 00:08:08.171902 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:08 crc kubenswrapper[4832]: I1003 00:08:08.171952 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:08 crc kubenswrapper[4832]: I1003 00:08:08.171961 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:08 crc kubenswrapper[4832]: I1003 00:08:08.171975 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:08 crc kubenswrapper[4832]: I1003 00:08:08.171986 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:08Z","lastTransitionTime":"2025-10-03T00:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:08 crc kubenswrapper[4832]: I1003 00:08:08.274358 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:08 crc kubenswrapper[4832]: I1003 00:08:08.274406 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:08 crc kubenswrapper[4832]: I1003 00:08:08.274417 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:08 crc kubenswrapper[4832]: I1003 00:08:08.274434 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:08 crc kubenswrapper[4832]: I1003 00:08:08.274445 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:08Z","lastTransitionTime":"2025-10-03T00:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:08 crc kubenswrapper[4832]: I1003 00:08:08.377163 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:08 crc kubenswrapper[4832]: I1003 00:08:08.377228 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:08 crc kubenswrapper[4832]: I1003 00:08:08.377247 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:08 crc kubenswrapper[4832]: I1003 00:08:08.377266 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:08 crc kubenswrapper[4832]: I1003 00:08:08.377280 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:08Z","lastTransitionTime":"2025-10-03T00:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:08:08 crc kubenswrapper[4832]: I1003 00:08:08.479680 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:08 crc kubenswrapper[4832]: I1003 00:08:08.479742 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:08 crc kubenswrapper[4832]: I1003 00:08:08.479784 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:08 crc kubenswrapper[4832]: I1003 00:08:08.479819 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:08 crc kubenswrapper[4832]: I1003 00:08:08.479840 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:08Z","lastTransitionTime":"2025-10-03T00:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:08 crc kubenswrapper[4832]: I1003 00:08:08.586322 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:08 crc kubenswrapper[4832]: I1003 00:08:08.586361 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:08 crc kubenswrapper[4832]: I1003 00:08:08.586371 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:08 crc kubenswrapper[4832]: I1003 00:08:08.586386 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:08 crc kubenswrapper[4832]: I1003 00:08:08.586395 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:08Z","lastTransitionTime":"2025-10-03T00:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:08 crc kubenswrapper[4832]: I1003 00:08:08.688416 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:08 crc kubenswrapper[4832]: I1003 00:08:08.688453 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:08 crc kubenswrapper[4832]: I1003 00:08:08.688462 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:08 crc kubenswrapper[4832]: I1003 00:08:08.688476 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:08 crc kubenswrapper[4832]: I1003 00:08:08.688485 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:08Z","lastTransitionTime":"2025-10-03T00:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:08:08 crc kubenswrapper[4832]: I1003 00:08:08.790983 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:08 crc kubenswrapper[4832]: I1003 00:08:08.791042 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:08 crc kubenswrapper[4832]: I1003 00:08:08.791052 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:08 crc kubenswrapper[4832]: I1003 00:08:08.791068 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:08 crc kubenswrapper[4832]: I1003 00:08:08.791081 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:08Z","lastTransitionTime":"2025-10-03T00:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:08 crc kubenswrapper[4832]: I1003 00:08:08.893351 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:08 crc kubenswrapper[4832]: I1003 00:08:08.893386 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:08 crc kubenswrapper[4832]: I1003 00:08:08.893394 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:08 crc kubenswrapper[4832]: I1003 00:08:08.893406 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:08 crc kubenswrapper[4832]: I1003 00:08:08.893415 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:08Z","lastTransitionTime":"2025-10-03T00:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:08 crc kubenswrapper[4832]: I1003 00:08:08.994899 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:08 crc kubenswrapper[4832]: I1003 00:08:08.994941 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:08 crc kubenswrapper[4832]: I1003 00:08:08.994953 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:08 crc kubenswrapper[4832]: I1003 00:08:08.994970 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:08 crc kubenswrapper[4832]: I1003 00:08:08.994982 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:08Z","lastTransitionTime":"2025-10-03T00:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:08:09 crc kubenswrapper[4832]: I1003 00:08:09.097206 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:09 crc kubenswrapper[4832]: I1003 00:08:09.097246 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:09 crc kubenswrapper[4832]: I1003 00:08:09.097256 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:09 crc kubenswrapper[4832]: I1003 00:08:09.097268 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:09 crc kubenswrapper[4832]: I1003 00:08:09.097276 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:09Z","lastTransitionTime":"2025-10-03T00:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:09 crc kubenswrapper[4832]: I1003 00:08:09.200045 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:09 crc kubenswrapper[4832]: I1003 00:08:09.200094 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:09 crc kubenswrapper[4832]: I1003 00:08:09.200106 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:09 crc kubenswrapper[4832]: I1003 00:08:09.200125 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:09 crc kubenswrapper[4832]: I1003 00:08:09.200137 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:09Z","lastTransitionTime":"2025-10-03T00:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:09 crc kubenswrapper[4832]: I1003 00:08:09.302391 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:09 crc kubenswrapper[4832]: I1003 00:08:09.302432 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:09 crc kubenswrapper[4832]: I1003 00:08:09.302444 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:09 crc kubenswrapper[4832]: I1003 00:08:09.302460 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:09 crc kubenswrapper[4832]: I1003 00:08:09.302472 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:09Z","lastTransitionTime":"2025-10-03T00:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:08:09 crc kubenswrapper[4832]: I1003 00:08:09.386511 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 00:08:09 crc kubenswrapper[4832]: I1003 00:08:09.386544 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 00:08:09 crc kubenswrapper[4832]: I1003 00:08:09.386544 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 00:08:09 crc kubenswrapper[4832]: E1003 00:08:09.386683 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 00:08:09 crc kubenswrapper[4832]: I1003 00:08:09.386734 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ppthn" Oct 03 00:08:09 crc kubenswrapper[4832]: E1003 00:08:09.386842 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 00:08:09 crc kubenswrapper[4832]: E1003 00:08:09.387020 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ppthn" podUID="f3e796ab-f354-47c2-8f1b-1d29fe9ab059" Oct 03 00:08:09 crc kubenswrapper[4832]: E1003 00:08:09.387172 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 00:08:09 crc kubenswrapper[4832]: I1003 00:08:09.404661 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:09 crc kubenswrapper[4832]: I1003 00:08:09.404749 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:09 crc kubenswrapper[4832]: I1003 00:08:09.404801 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:09 crc kubenswrapper[4832]: I1003 00:08:09.404853 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:09 crc kubenswrapper[4832]: I1003 00:08:09.404871 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:09Z","lastTransitionTime":"2025-10-03T00:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:09 crc kubenswrapper[4832]: I1003 00:08:09.507962 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:09 crc kubenswrapper[4832]: I1003 00:08:09.508035 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:09 crc kubenswrapper[4832]: I1003 00:08:09.508057 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:09 crc kubenswrapper[4832]: I1003 00:08:09.508087 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:09 crc kubenswrapper[4832]: I1003 00:08:09.508112 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:09Z","lastTransitionTime":"2025-10-03T00:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:08:09 crc kubenswrapper[4832]: I1003 00:08:09.610716 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:09 crc kubenswrapper[4832]: I1003 00:08:09.610806 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:09 crc kubenswrapper[4832]: I1003 00:08:09.610826 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:09 crc kubenswrapper[4832]: I1003 00:08:09.610850 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:09 crc kubenswrapper[4832]: I1003 00:08:09.610867 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:09Z","lastTransitionTime":"2025-10-03T00:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:09 crc kubenswrapper[4832]: I1003 00:08:09.712570 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:09 crc kubenswrapper[4832]: I1003 00:08:09.712596 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:09 crc kubenswrapper[4832]: I1003 00:08:09.712604 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:09 crc kubenswrapper[4832]: I1003 00:08:09.712617 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:09 crc kubenswrapper[4832]: I1003 00:08:09.712626 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:09Z","lastTransitionTime":"2025-10-03T00:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:09 crc kubenswrapper[4832]: I1003 00:08:09.815543 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:09 crc kubenswrapper[4832]: I1003 00:08:09.815584 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:09 crc kubenswrapper[4832]: I1003 00:08:09.815591 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:09 crc kubenswrapper[4832]: I1003 00:08:09.815607 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:09 crc kubenswrapper[4832]: I1003 00:08:09.815616 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:09Z","lastTransitionTime":"2025-10-03T00:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:08:09 crc kubenswrapper[4832]: I1003 00:08:09.918130 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:09 crc kubenswrapper[4832]: I1003 00:08:09.918160 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:09 crc kubenswrapper[4832]: I1003 00:08:09.918169 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:09 crc kubenswrapper[4832]: I1003 00:08:09.918180 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:09 crc kubenswrapper[4832]: I1003 00:08:09.918189 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:09Z","lastTransitionTime":"2025-10-03T00:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:09 crc kubenswrapper[4832]: I1003 00:08:09.983963 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 00:08:09 crc kubenswrapper[4832]: E1003 00:08:09.984122 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 00:09:13.984098554 +0000 UTC m=+147.246253660 (durationBeforeRetry 1m4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 00:08:09 crc kubenswrapper[4832]: I1003 00:08:09.984167 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 00:08:09 crc kubenswrapper[4832]: I1003 00:08:09.984226 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 00:08:09 crc kubenswrapper[4832]: I1003 00:08:09.984258 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 00:08:09 crc kubenswrapper[4832]: I1003 00:08:09.984274 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 00:08:09 crc kubenswrapper[4832]: E1003 00:08:09.984351 4832 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 00:08:09 crc kubenswrapper[4832]: E1003 00:08:09.984365 4832 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 00:08:09 crc kubenswrapper[4832]: E1003 00:08:09.984372 4832 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 00:08:09 crc kubenswrapper[4832]: E1003 00:08:09.984386 4832 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 00:08:09 crc kubenswrapper[4832]: E1003 00:08:09.984404 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 
nodeName:}" failed. No retries permitted until 2025-10-03 00:09:13.984395274 +0000 UTC m=+147.246550380 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 00:08:09 crc kubenswrapper[4832]: E1003 00:08:09.984438 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-03 00:09:13.984419745 +0000 UTC m=+147.246575041 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 00:08:09 crc kubenswrapper[4832]: E1003 00:08:09.984433 4832 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 03 00:08:09 crc kubenswrapper[4832]: E1003 00:08:09.984526 4832 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 00:08:09 crc kubenswrapper[4832]: E1003 00:08:09.984561 4832 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 00:08:09 crc kubenswrapper[4832]: E1003 00:08:09.984582 4832 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 00:08:09 crc kubenswrapper[4832]: E1003 00:08:09.984585 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 00:09:13.984538099 +0000 UTC m=+147.246693205 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 03 00:08:09 crc kubenswrapper[4832]: E1003 00:08:09.984653 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-03 00:09:13.984635033 +0000 UTC m=+147.246790339 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 00:08:10 crc kubenswrapper[4832]: I1003 00:08:10.020535 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:10 crc kubenswrapper[4832]: I1003 00:08:10.020583 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:10 crc kubenswrapper[4832]: I1003 00:08:10.020595 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:10 crc kubenswrapper[4832]: I1003 00:08:10.020613 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:10 crc kubenswrapper[4832]: I1003 00:08:10.020626 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:10Z","lastTransitionTime":"2025-10-03T00:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:10 crc kubenswrapper[4832]: I1003 00:08:10.123622 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:10 crc kubenswrapper[4832]: I1003 00:08:10.123672 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:10 crc kubenswrapper[4832]: I1003 00:08:10.123689 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:10 crc kubenswrapper[4832]: I1003 00:08:10.123714 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:10 crc kubenswrapper[4832]: I1003 00:08:10.123728 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:10Z","lastTransitionTime":"2025-10-03T00:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:08:10 crc kubenswrapper[4832]: I1003 00:08:10.226654 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:10 crc kubenswrapper[4832]: I1003 00:08:10.226716 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:10 crc kubenswrapper[4832]: I1003 00:08:10.226725 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:10 crc kubenswrapper[4832]: I1003 00:08:10.226740 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:10 crc kubenswrapper[4832]: I1003 00:08:10.226783 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:10Z","lastTransitionTime":"2025-10-03T00:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:10 crc kubenswrapper[4832]: I1003 00:08:10.329885 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:10 crc kubenswrapper[4832]: I1003 00:08:10.329942 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:10 crc kubenswrapper[4832]: I1003 00:08:10.329961 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:10 crc kubenswrapper[4832]: I1003 00:08:10.329988 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:10 crc kubenswrapper[4832]: I1003 00:08:10.330008 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:10Z","lastTransitionTime":"2025-10-03T00:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:10 crc kubenswrapper[4832]: I1003 00:08:10.432240 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:10 crc kubenswrapper[4832]: I1003 00:08:10.432286 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:10 crc kubenswrapper[4832]: I1003 00:08:10.432303 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:10 crc kubenswrapper[4832]: I1003 00:08:10.432324 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:10 crc kubenswrapper[4832]: I1003 00:08:10.432341 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:10Z","lastTransitionTime":"2025-10-03T00:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:08:10 crc kubenswrapper[4832]: I1003 00:08:10.534610 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:10 crc kubenswrapper[4832]: I1003 00:08:10.534665 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:10 crc kubenswrapper[4832]: I1003 00:08:10.534676 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:10 crc kubenswrapper[4832]: I1003 00:08:10.534689 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:10 crc kubenswrapper[4832]: I1003 00:08:10.534697 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:10Z","lastTransitionTime":"2025-10-03T00:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:10 crc kubenswrapper[4832]: I1003 00:08:10.637631 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:10 crc kubenswrapper[4832]: I1003 00:08:10.637674 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:10 crc kubenswrapper[4832]: I1003 00:08:10.637684 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:10 crc kubenswrapper[4832]: I1003 00:08:10.637703 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:10 crc kubenswrapper[4832]: I1003 00:08:10.637714 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:10Z","lastTransitionTime":"2025-10-03T00:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:10 crc kubenswrapper[4832]: I1003 00:08:10.740264 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:10 crc kubenswrapper[4832]: I1003 00:08:10.740325 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:10 crc kubenswrapper[4832]: I1003 00:08:10.740344 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:10 crc kubenswrapper[4832]: I1003 00:08:10.740370 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:10 crc kubenswrapper[4832]: I1003 00:08:10.740387 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:10Z","lastTransitionTime":"2025-10-03T00:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:08:10 crc kubenswrapper[4832]: I1003 00:08:10.843018 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:10 crc kubenswrapper[4832]: I1003 00:08:10.843048 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:10 crc kubenswrapper[4832]: I1003 00:08:10.843060 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:10 crc kubenswrapper[4832]: I1003 00:08:10.843077 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:10 crc kubenswrapper[4832]: I1003 00:08:10.843089 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:10Z","lastTransitionTime":"2025-10-03T00:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:10 crc kubenswrapper[4832]: I1003 00:08:10.945254 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:10 crc kubenswrapper[4832]: I1003 00:08:10.945289 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:10 crc kubenswrapper[4832]: I1003 00:08:10.945297 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:10 crc kubenswrapper[4832]: I1003 00:08:10.945310 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:10 crc kubenswrapper[4832]: I1003 00:08:10.945319 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:10Z","lastTransitionTime":"2025-10-03T00:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:11 crc kubenswrapper[4832]: I1003 00:08:11.048436 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:11 crc kubenswrapper[4832]: I1003 00:08:11.048518 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:11 crc kubenswrapper[4832]: I1003 00:08:11.048538 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:11 crc kubenswrapper[4832]: I1003 00:08:11.048563 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:11 crc kubenswrapper[4832]: I1003 00:08:11.048582 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:11Z","lastTransitionTime":"2025-10-03T00:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:08:11 crc kubenswrapper[4832]: I1003 00:08:11.152300 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:11 crc kubenswrapper[4832]: I1003 00:08:11.152382 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:11 crc kubenswrapper[4832]: I1003 00:08:11.152400 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:11 crc kubenswrapper[4832]: I1003 00:08:11.152437 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:11 crc kubenswrapper[4832]: I1003 00:08:11.152456 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:11Z","lastTransitionTime":"2025-10-03T00:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:11 crc kubenswrapper[4832]: I1003 00:08:11.255834 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:11 crc kubenswrapper[4832]: I1003 00:08:11.255885 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:11 crc kubenswrapper[4832]: I1003 00:08:11.255900 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:11 crc kubenswrapper[4832]: I1003 00:08:11.255919 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:11 crc kubenswrapper[4832]: I1003 00:08:11.255933 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:11Z","lastTransitionTime":"2025-10-03T00:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:11 crc kubenswrapper[4832]: I1003 00:08:11.359965 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:11 crc kubenswrapper[4832]: I1003 00:08:11.360047 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:11 crc kubenswrapper[4832]: I1003 00:08:11.360092 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:11 crc kubenswrapper[4832]: I1003 00:08:11.360127 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:11 crc kubenswrapper[4832]: I1003 00:08:11.360155 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:11Z","lastTransitionTime":"2025-10-03T00:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:08:11 crc kubenswrapper[4832]: I1003 00:08:11.385634 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 00:08:11 crc kubenswrapper[4832]: I1003 00:08:11.385734 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ppthn" Oct 03 00:08:11 crc kubenswrapper[4832]: I1003 00:08:11.385824 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 00:08:11 crc kubenswrapper[4832]: I1003 00:08:11.385648 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 00:08:11 crc kubenswrapper[4832]: E1003 00:08:11.385782 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 00:08:11 crc kubenswrapper[4832]: E1003 00:08:11.385987 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 00:08:11 crc kubenswrapper[4832]: E1003 00:08:11.386130 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ppthn" podUID="f3e796ab-f354-47c2-8f1b-1d29fe9ab059" Oct 03 00:08:11 crc kubenswrapper[4832]: E1003 00:08:11.386297 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 00:08:11 crc kubenswrapper[4832]: I1003 00:08:11.463446 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:11 crc kubenswrapper[4832]: I1003 00:08:11.463495 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:11 crc kubenswrapper[4832]: I1003 00:08:11.463509 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:11 crc kubenswrapper[4832]: I1003 00:08:11.463534 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:11 crc kubenswrapper[4832]: I1003 00:08:11.463549 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:11Z","lastTransitionTime":"2025-10-03T00:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:11 crc kubenswrapper[4832]: I1003 00:08:11.566819 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:11 crc kubenswrapper[4832]: I1003 00:08:11.566851 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:11 crc kubenswrapper[4832]: I1003 00:08:11.566862 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:11 crc kubenswrapper[4832]: I1003 00:08:11.566875 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:11 crc kubenswrapper[4832]: I1003 00:08:11.566884 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:11Z","lastTransitionTime":"2025-10-03T00:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:08:11 crc kubenswrapper[4832]: I1003 00:08:11.669615 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:11 crc kubenswrapper[4832]: I1003 00:08:11.669656 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:11 crc kubenswrapper[4832]: I1003 00:08:11.669668 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:11 crc kubenswrapper[4832]: I1003 00:08:11.669685 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:11 crc kubenswrapper[4832]: I1003 00:08:11.669697 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:11Z","lastTransitionTime":"2025-10-03T00:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:11 crc kubenswrapper[4832]: I1003 00:08:11.772607 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:11 crc kubenswrapper[4832]: I1003 00:08:11.772649 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:11 crc kubenswrapper[4832]: I1003 00:08:11.772662 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:11 crc kubenswrapper[4832]: I1003 00:08:11.772680 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:11 crc kubenswrapper[4832]: I1003 00:08:11.772691 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:11Z","lastTransitionTime":"2025-10-03T00:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:11 crc kubenswrapper[4832]: I1003 00:08:11.875235 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:11 crc kubenswrapper[4832]: I1003 00:08:11.875309 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:11 crc kubenswrapper[4832]: I1003 00:08:11.875330 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:11 crc kubenswrapper[4832]: I1003 00:08:11.875358 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:11 crc kubenswrapper[4832]: I1003 00:08:11.875382 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:11Z","lastTransitionTime":"2025-10-03T00:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:08:11 crc kubenswrapper[4832]: I1003 00:08:11.977954 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:11 crc kubenswrapper[4832]: I1003 00:08:11.977996 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:11 crc kubenswrapper[4832]: I1003 00:08:11.978009 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:11 crc kubenswrapper[4832]: I1003 00:08:11.978024 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:11 crc kubenswrapper[4832]: I1003 00:08:11.978036 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:11Z","lastTransitionTime":"2025-10-03T00:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:12 crc kubenswrapper[4832]: I1003 00:08:12.080951 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:12 crc kubenswrapper[4832]: I1003 00:08:12.081007 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:12 crc kubenswrapper[4832]: I1003 00:08:12.081018 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:12 crc kubenswrapper[4832]: I1003 00:08:12.081037 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:12 crc kubenswrapper[4832]: I1003 00:08:12.081048 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:12Z","lastTransitionTime":"2025-10-03T00:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:12 crc kubenswrapper[4832]: I1003 00:08:12.184621 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:12 crc kubenswrapper[4832]: I1003 00:08:12.184682 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:12 crc kubenswrapper[4832]: I1003 00:08:12.184699 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:12 crc kubenswrapper[4832]: I1003 00:08:12.184722 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:12 crc kubenswrapper[4832]: I1003 00:08:12.184738 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:12Z","lastTransitionTime":"2025-10-03T00:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:08:12 crc kubenswrapper[4832]: I1003 00:08:12.287435 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:12 crc kubenswrapper[4832]: I1003 00:08:12.287488 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:12 crc kubenswrapper[4832]: I1003 00:08:12.287508 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:12 crc kubenswrapper[4832]: I1003 00:08:12.287532 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:12 crc kubenswrapper[4832]: I1003 00:08:12.287548 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:12Z","lastTransitionTime":"2025-10-03T00:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:12 crc kubenswrapper[4832]: I1003 00:08:12.390124 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:12 crc kubenswrapper[4832]: I1003 00:08:12.390183 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:12 crc kubenswrapper[4832]: I1003 00:08:12.390201 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:12 crc kubenswrapper[4832]: I1003 00:08:12.390224 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:12 crc kubenswrapper[4832]: I1003 00:08:12.390246 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:12Z","lastTransitionTime":"2025-10-03T00:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:12 crc kubenswrapper[4832]: I1003 00:08:12.493107 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:12 crc kubenswrapper[4832]: I1003 00:08:12.493170 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:12 crc kubenswrapper[4832]: I1003 00:08:12.493187 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:12 crc kubenswrapper[4832]: I1003 00:08:12.493211 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:12 crc kubenswrapper[4832]: I1003 00:08:12.493228 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:12Z","lastTransitionTime":"2025-10-03T00:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:08:12 crc kubenswrapper[4832]: I1003 00:08:12.596380 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:12 crc kubenswrapper[4832]: I1003 00:08:12.596429 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:12 crc kubenswrapper[4832]: I1003 00:08:12.596441 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:12 crc kubenswrapper[4832]: I1003 00:08:12.596459 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:12 crc kubenswrapper[4832]: I1003 00:08:12.596469 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:12Z","lastTransitionTime":"2025-10-03T00:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:12 crc kubenswrapper[4832]: I1003 00:08:12.699423 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:12 crc kubenswrapper[4832]: I1003 00:08:12.699466 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:12 crc kubenswrapper[4832]: I1003 00:08:12.699480 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:12 crc kubenswrapper[4832]: I1003 00:08:12.699499 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:12 crc kubenswrapper[4832]: I1003 00:08:12.699513 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:12Z","lastTransitionTime":"2025-10-03T00:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:12 crc kubenswrapper[4832]: I1003 00:08:12.801924 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:12 crc kubenswrapper[4832]: I1003 00:08:12.801971 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:12 crc kubenswrapper[4832]: I1003 00:08:12.801982 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:12 crc kubenswrapper[4832]: I1003 00:08:12.802000 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:12 crc kubenswrapper[4832]: I1003 00:08:12.802011 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:12Z","lastTransitionTime":"2025-10-03T00:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:08:12 crc kubenswrapper[4832]: I1003 00:08:12.905074 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:12 crc kubenswrapper[4832]: I1003 00:08:12.905132 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:12 crc kubenswrapper[4832]: I1003 00:08:12.905156 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:12 crc kubenswrapper[4832]: I1003 00:08:12.905177 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:12 crc kubenswrapper[4832]: I1003 00:08:12.905191 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:12Z","lastTransitionTime":"2025-10-03T00:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:13 crc kubenswrapper[4832]: I1003 00:08:13.008300 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:13 crc kubenswrapper[4832]: I1003 00:08:13.008346 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:13 crc kubenswrapper[4832]: I1003 00:08:13.008361 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:13 crc kubenswrapper[4832]: I1003 00:08:13.008383 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:13 crc kubenswrapper[4832]: I1003 00:08:13.008399 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:13Z","lastTransitionTime":"2025-10-03T00:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:13 crc kubenswrapper[4832]: I1003 00:08:13.110331 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:13 crc kubenswrapper[4832]: I1003 00:08:13.110375 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:13 crc kubenswrapper[4832]: I1003 00:08:13.110390 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:13 crc kubenswrapper[4832]: I1003 00:08:13.110413 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:13 crc kubenswrapper[4832]: I1003 00:08:13.110427 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:13Z","lastTransitionTime":"2025-10-03T00:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:08:13 crc kubenswrapper[4832]: I1003 00:08:13.212261 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:13 crc kubenswrapper[4832]: I1003 00:08:13.212354 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:13 crc kubenswrapper[4832]: I1003 00:08:13.212376 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:13 crc kubenswrapper[4832]: I1003 00:08:13.212724 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:13 crc kubenswrapper[4832]: I1003 00:08:13.213018 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:13Z","lastTransitionTime":"2025-10-03T00:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:13 crc kubenswrapper[4832]: I1003 00:08:13.315657 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:13 crc kubenswrapper[4832]: I1003 00:08:13.315709 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:13 crc kubenswrapper[4832]: I1003 00:08:13.315726 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:13 crc kubenswrapper[4832]: I1003 00:08:13.315784 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:13 crc kubenswrapper[4832]: I1003 00:08:13.315803 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:13Z","lastTransitionTime":"2025-10-03T00:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:13 crc kubenswrapper[4832]: I1003 00:08:13.385448 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 00:08:13 crc kubenswrapper[4832]: I1003 00:08:13.385472 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 00:08:13 crc kubenswrapper[4832]: I1003 00:08:13.385472 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ppthn" Oct 03 00:08:13 crc kubenswrapper[4832]: E1003 00:08:13.385591 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 00:08:13 crc kubenswrapper[4832]: I1003 00:08:13.385679 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 00:08:13 crc kubenswrapper[4832]: E1003 00:08:13.385854 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 00:08:13 crc kubenswrapper[4832]: E1003 00:08:13.385915 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 00:08:13 crc kubenswrapper[4832]: E1003 00:08:13.386025 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ppthn" podUID="f3e796ab-f354-47c2-8f1b-1d29fe9ab059" Oct 03 00:08:13 crc kubenswrapper[4832]: I1003 00:08:13.417352 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:13 crc kubenswrapper[4832]: I1003 00:08:13.417392 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:13 crc kubenswrapper[4832]: I1003 00:08:13.417400 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:13 crc kubenswrapper[4832]: I1003 00:08:13.417413 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:13 crc kubenswrapper[4832]: I1003 00:08:13.417422 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:13Z","lastTransitionTime":"2025-10-03T00:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:08:13 crc kubenswrapper[4832]: I1003 00:08:13.519447 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:13 crc kubenswrapper[4832]: I1003 00:08:13.519483 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:13 crc kubenswrapper[4832]: I1003 00:08:13.519493 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:13 crc kubenswrapper[4832]: I1003 00:08:13.519507 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:13 crc kubenswrapper[4832]: I1003 00:08:13.519516 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:13Z","lastTransitionTime":"2025-10-03T00:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:13 crc kubenswrapper[4832]: I1003 00:08:13.621816 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:13 crc kubenswrapper[4832]: I1003 00:08:13.621867 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:13 crc kubenswrapper[4832]: I1003 00:08:13.621881 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:13 crc kubenswrapper[4832]: I1003 00:08:13.621899 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:13 crc kubenswrapper[4832]: I1003 00:08:13.621911 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:13Z","lastTransitionTime":"2025-10-03T00:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:13 crc kubenswrapper[4832]: I1003 00:08:13.723866 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:13 crc kubenswrapper[4832]: I1003 00:08:13.723912 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:13 crc kubenswrapper[4832]: I1003 00:08:13.723923 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:13 crc kubenswrapper[4832]: I1003 00:08:13.723940 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:13 crc kubenswrapper[4832]: I1003 00:08:13.723953 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:13Z","lastTransitionTime":"2025-10-03T00:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:08:13 crc kubenswrapper[4832]: I1003 00:08:13.825598 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:13 crc kubenswrapper[4832]: I1003 00:08:13.825652 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:13 crc kubenswrapper[4832]: I1003 00:08:13.825668 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:13 crc kubenswrapper[4832]: I1003 00:08:13.825692 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:13 crc kubenswrapper[4832]: I1003 00:08:13.825708 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:13Z","lastTransitionTime":"2025-10-03T00:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:13 crc kubenswrapper[4832]: I1003 00:08:13.932430 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:13 crc kubenswrapper[4832]: I1003 00:08:13.932815 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:13 crc kubenswrapper[4832]: I1003 00:08:13.932827 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:13 crc kubenswrapper[4832]: I1003 00:08:13.932847 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:13 crc kubenswrapper[4832]: I1003 00:08:13.932859 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:13Z","lastTransitionTime":"2025-10-03T00:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:14 crc kubenswrapper[4832]: I1003 00:08:14.036136 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:14 crc kubenswrapper[4832]: I1003 00:08:14.036200 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:14 crc kubenswrapper[4832]: I1003 00:08:14.036218 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:14 crc kubenswrapper[4832]: I1003 00:08:14.036245 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:14 crc kubenswrapper[4832]: I1003 00:08:14.036267 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:14Z","lastTransitionTime":"2025-10-03T00:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:08:14 crc kubenswrapper[4832]: I1003 00:08:14.139499 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:14 crc kubenswrapper[4832]: I1003 00:08:14.139546 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:14 crc kubenswrapper[4832]: I1003 00:08:14.139556 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:14 crc kubenswrapper[4832]: I1003 00:08:14.139570 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:14 crc kubenswrapper[4832]: I1003 00:08:14.139579 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:14Z","lastTransitionTime":"2025-10-03T00:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:14 crc kubenswrapper[4832]: I1003 00:08:14.242317 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:14 crc kubenswrapper[4832]: I1003 00:08:14.242364 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:14 crc kubenswrapper[4832]: I1003 00:08:14.242377 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:14 crc kubenswrapper[4832]: I1003 00:08:14.242391 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:14 crc kubenswrapper[4832]: I1003 00:08:14.242402 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:14Z","lastTransitionTime":"2025-10-03T00:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:14 crc kubenswrapper[4832]: I1003 00:08:14.293276 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:14 crc kubenswrapper[4832]: I1003 00:08:14.293343 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:14 crc kubenswrapper[4832]: I1003 00:08:14.293362 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:14 crc kubenswrapper[4832]: I1003 00:08:14.293388 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:14 crc kubenswrapper[4832]: I1003 00:08:14.293411 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:14Z","lastTransitionTime":"2025-10-03T00:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:08:14 crc kubenswrapper[4832]: E1003 00:08:14.313857 4832 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T00:08:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T00:08:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T00:08:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T00:08:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T00:08:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T00:08:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T00:08:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T00:08:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1e5f816a-2677-4a9b-8b73-ae876a2ba5f2\\\",\\\"systemUUID\\\":\\\"22c336a9-c598-416a-a85e-c54be0a6a3c5\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:08:14Z is after 2025-08-24T17:21:41Z" Oct 03 00:08:14 crc kubenswrapper[4832]: I1003 00:08:14.318724 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:14 crc kubenswrapper[4832]: I1003 00:08:14.318778 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
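
The status-patch failure above is not a networking problem in itself: the kubelet's PATCH of the node status is rejected because the node.network-node-identity.openshift.io admission webhook at https://127.0.0.1:9743 presents a certificate that expired on 2025-08-24T17:21:41Z, well before the current time of 2025-10-03. A minimal sketch of how one might inspect that certificate's validity window from the node follows; the host and port are taken from the error text, while the script itself, its reliance on a local openssl binary, and the assumption that the endpoint will complete a handshake without a client certificate are all illustrative, not part of the logged system.

    # webhook_cert_check.py -- illustrative sketch; host/port come from the webhook error above
    import socket, ssl, subprocess

    HOST, PORT = "127.0.0.1", 9743

    ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
    ctx.check_hostname = False
    ctx.verify_mode = ssl.CERT_NONE        # the certificate is expired, so skip verification

    with socket.create_connection((HOST, PORT), timeout=5) as raw:
        with ctx.wrap_socket(raw, server_hostname=HOST) as tls:
            der = tls.getpeercert(binary_form=True)   # DER bytes; the dict form is empty with CERT_NONE

    pem = ssl.DER_cert_to_PEM_cert(der)
    # Assumes an openssl binary is available on the node; prints subject plus notBefore/notAfter.
    result = subprocess.run(["openssl", "x509", "-noout", "-dates", "-subject"],
                            input=pem.encode(), capture_output=True, check=True)
    print(result.stdout.decode())

If the printed notAfter date matches the 2025-08-24 timestamp in the error, the repeated "Error updating node status, will retry" entries that follow are expected until that webhook's serving certificate is rotated.
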
event="NodeHasNoDiskPressure" Oct 03 00:08:14 crc kubenswrapper[4832]: I1003 00:08:14.318789 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:14 crc kubenswrapper[4832]: I1003 00:08:14.318804 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:14 crc kubenswrapper[4832]: I1003 00:08:14.318813 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:14Z","lastTransitionTime":"2025-10-03T00:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:14 crc kubenswrapper[4832]: E1003 00:08:14.332112 4832 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T00:08:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T00:08:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T00:08:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T00:08:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T00:08:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T00:08:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T00:08:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T00:08:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1e5f816a-2677-4a9b-8b73-ae876a2ba5f2\\\",\\\"systemUUID\\\":\\\"22c336a9-c598-416a-a85e-c54be0a6a3c5\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:08:14Z is after 2025-08-24T17:21:41Z" Oct 03 00:08:14 crc kubenswrapper[4832]: I1003 00:08:14.337676 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:14 crc kubenswrapper[4832]: I1003 00:08:14.337744 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 00:08:14 crc kubenswrapper[4832]: I1003 00:08:14.337794 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:14 crc kubenswrapper[4832]: I1003 00:08:14.337823 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:14 crc kubenswrapper[4832]: I1003 00:08:14.337866 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:14Z","lastTransitionTime":"2025-10-03T00:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:14 crc kubenswrapper[4832]: E1003 00:08:14.353432 4832 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T00:08:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T00:08:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T00:08:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T00:08:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T00:08:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T00:08:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T00:08:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T00:08:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1e5f816a-2677-4a9b-8b73-ae876a2ba5f2\\\",\\\"systemUUID\\\":\\\"22c336a9-c598-416a-a85e-c54be0a6a3c5\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:08:14Z is after 2025-08-24T17:21:41Z" Oct 03 00:08:14 crc kubenswrapper[4832]: I1003 00:08:14.358205 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:14 crc kubenswrapper[4832]: I1003 00:08:14.358255 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 00:08:14 crc kubenswrapper[4832]: I1003 00:08:14.358267 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:14 crc kubenswrapper[4832]: I1003 00:08:14.358285 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:14 crc kubenswrapper[4832]: I1003 00:08:14.358300 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:14Z","lastTransitionTime":"2025-10-03T00:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:14 crc kubenswrapper[4832]: E1003 00:08:14.371005 4832 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T00:08:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T00:08:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T00:08:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T00:08:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T00:08:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T00:08:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T00:08:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T00:08:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1e5f816a-2677-4a9b-8b73-ae876a2ba5f2\\\",\\\"systemUUID\\\":\\\"22c336a9-c598-416a-a85e-c54be0a6a3c5\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:08:14Z is after 2025-08-24T17:21:41Z" Oct 03 00:08:14 crc kubenswrapper[4832]: I1003 00:08:14.375852 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:14 crc kubenswrapper[4832]: I1003 00:08:14.375929 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 00:08:14 crc kubenswrapper[4832]: I1003 00:08:14.375952 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:14 crc kubenswrapper[4832]: I1003 00:08:14.375981 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:14 crc kubenswrapper[4832]: I1003 00:08:14.376003 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:14Z","lastTransitionTime":"2025-10-03T00:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:14 crc kubenswrapper[4832]: E1003 00:08:14.397043 4832 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T00:08:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T00:08:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T00:08:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T00:08:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T00:08:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T00:08:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T00:08:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T00:08:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1e5f816a-2677-4a9b-8b73-ae876a2ba5f2\\\",\\\"systemUUID\\\":\\\"22c336a9-c598-416a-a85e-c54be0a6a3c5\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:08:14Z is after 2025-08-24T17:21:41Z" Oct 03 00:08:14 crc kubenswrapper[4832]: E1003 00:08:14.397366 4832 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 03 00:08:14 crc kubenswrapper[4832]: I1003 00:08:14.399311 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 03 00:08:14 crc kubenswrapper[4832]: I1003 00:08:14.399369 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:14 crc kubenswrapper[4832]: I1003 00:08:14.399389 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:14 crc kubenswrapper[4832]: I1003 00:08:14.399417 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:14 crc kubenswrapper[4832]: I1003 00:08:14.399439 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:14Z","lastTransitionTime":"2025-10-03T00:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:14 crc kubenswrapper[4832]: I1003 00:08:14.502228 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:14 crc kubenswrapper[4832]: I1003 00:08:14.502334 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:14 crc kubenswrapper[4832]: I1003 00:08:14.502348 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:14 crc kubenswrapper[4832]: I1003 00:08:14.502373 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:14 crc kubenswrapper[4832]: I1003 00:08:14.502388 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:14Z","lastTransitionTime":"2025-10-03T00:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:14 crc kubenswrapper[4832]: I1003 00:08:14.605586 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:14 crc kubenswrapper[4832]: I1003 00:08:14.605690 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:14 crc kubenswrapper[4832]: I1003 00:08:14.605704 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:14 crc kubenswrapper[4832]: I1003 00:08:14.605728 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:14 crc kubenswrapper[4832]: I1003 00:08:14.605742 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:14Z","lastTransitionTime":"2025-10-03T00:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:08:14 crc kubenswrapper[4832]: I1003 00:08:14.708497 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:14 crc kubenswrapper[4832]: I1003 00:08:14.708542 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:14 crc kubenswrapper[4832]: I1003 00:08:14.708551 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:14 crc kubenswrapper[4832]: I1003 00:08:14.708570 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:14 crc kubenswrapper[4832]: I1003 00:08:14.708582 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:14Z","lastTransitionTime":"2025-10-03T00:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:14 crc kubenswrapper[4832]: I1003 00:08:14.812690 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:14 crc kubenswrapper[4832]: I1003 00:08:14.812736 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:14 crc kubenswrapper[4832]: I1003 00:08:14.812746 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:14 crc kubenswrapper[4832]: I1003 00:08:14.812774 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:14 crc kubenswrapper[4832]: I1003 00:08:14.812785 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:14Z","lastTransitionTime":"2025-10-03T00:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:14 crc kubenswrapper[4832]: I1003 00:08:14.915736 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:14 crc kubenswrapper[4832]: I1003 00:08:14.915878 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:14 crc kubenswrapper[4832]: I1003 00:08:14.915897 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:14 crc kubenswrapper[4832]: I1003 00:08:14.915926 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:14 crc kubenswrapper[4832]: I1003 00:08:14.915954 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:14Z","lastTransitionTime":"2025-10-03T00:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:08:15 crc kubenswrapper[4832]: I1003 00:08:15.018509 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:15 crc kubenswrapper[4832]: I1003 00:08:15.018682 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:15 crc kubenswrapper[4832]: I1003 00:08:15.018706 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:15 crc kubenswrapper[4832]: I1003 00:08:15.018803 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:15 crc kubenswrapper[4832]: I1003 00:08:15.018829 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:15Z","lastTransitionTime":"2025-10-03T00:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:15 crc kubenswrapper[4832]: I1003 00:08:15.122111 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:15 crc kubenswrapper[4832]: I1003 00:08:15.122656 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:15 crc kubenswrapper[4832]: I1003 00:08:15.122671 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:15 crc kubenswrapper[4832]: I1003 00:08:15.122695 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:15 crc kubenswrapper[4832]: I1003 00:08:15.122711 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:15Z","lastTransitionTime":"2025-10-03T00:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:15 crc kubenswrapper[4832]: I1003 00:08:15.226583 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:15 crc kubenswrapper[4832]: I1003 00:08:15.226668 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:15 crc kubenswrapper[4832]: I1003 00:08:15.226686 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:15 crc kubenswrapper[4832]: I1003 00:08:15.226719 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:15 crc kubenswrapper[4832]: I1003 00:08:15.226745 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:15Z","lastTransitionTime":"2025-10-03T00:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:08:15 crc kubenswrapper[4832]: I1003 00:08:15.330717 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:15 crc kubenswrapper[4832]: I1003 00:08:15.330802 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:15 crc kubenswrapper[4832]: I1003 00:08:15.330819 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:15 crc kubenswrapper[4832]: I1003 00:08:15.330841 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:15 crc kubenswrapper[4832]: I1003 00:08:15.330858 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:15Z","lastTransitionTime":"2025-10-03T00:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:15 crc kubenswrapper[4832]: I1003 00:08:15.386581 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ppthn" Oct 03 00:08:15 crc kubenswrapper[4832]: I1003 00:08:15.386677 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 00:08:15 crc kubenswrapper[4832]: I1003 00:08:15.386677 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 00:08:15 crc kubenswrapper[4832]: E1003 00:08:15.386900 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ppthn" podUID="f3e796ab-f354-47c2-8f1b-1d29fe9ab059" Oct 03 00:08:15 crc kubenswrapper[4832]: E1003 00:08:15.387040 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 00:08:15 crc kubenswrapper[4832]: I1003 00:08:15.387095 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 00:08:15 crc kubenswrapper[4832]: E1003 00:08:15.387180 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 00:08:15 crc kubenswrapper[4832]: E1003 00:08:15.387288 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 00:08:15 crc kubenswrapper[4832]: I1003 00:08:15.434582 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:15 crc kubenswrapper[4832]: I1003 00:08:15.434638 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:15 crc kubenswrapper[4832]: I1003 00:08:15.434650 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:15 crc kubenswrapper[4832]: I1003 00:08:15.434674 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:15 crc kubenswrapper[4832]: I1003 00:08:15.434688 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:15Z","lastTransitionTime":"2025-10-03T00:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:15 crc kubenswrapper[4832]: I1003 00:08:15.538572 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:15 crc kubenswrapper[4832]: I1003 00:08:15.539042 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:15 crc kubenswrapper[4832]: I1003 00:08:15.539132 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:15 crc kubenswrapper[4832]: I1003 00:08:15.539248 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:15 crc kubenswrapper[4832]: I1003 00:08:15.539345 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:15Z","lastTransitionTime":"2025-10-03T00:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:08:15 crc kubenswrapper[4832]: I1003 00:08:15.642882 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:15 crc kubenswrapper[4832]: I1003 00:08:15.642984 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:15 crc kubenswrapper[4832]: I1003 00:08:15.643010 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:15 crc kubenswrapper[4832]: I1003 00:08:15.643040 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:15 crc kubenswrapper[4832]: I1003 00:08:15.643060 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:15Z","lastTransitionTime":"2025-10-03T00:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:15 crc kubenswrapper[4832]: I1003 00:08:15.746242 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:15 crc kubenswrapper[4832]: I1003 00:08:15.746326 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:15 crc kubenswrapper[4832]: I1003 00:08:15.746350 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:15 crc kubenswrapper[4832]: I1003 00:08:15.746383 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:15 crc kubenswrapper[4832]: I1003 00:08:15.746407 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:15Z","lastTransitionTime":"2025-10-03T00:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:15 crc kubenswrapper[4832]: I1003 00:08:15.850296 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:15 crc kubenswrapper[4832]: I1003 00:08:15.850370 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:15 crc kubenswrapper[4832]: I1003 00:08:15.850389 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:15 crc kubenswrapper[4832]: I1003 00:08:15.850418 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:15 crc kubenswrapper[4832]: I1003 00:08:15.850439 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:15Z","lastTransitionTime":"2025-10-03T00:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:08:15 crc kubenswrapper[4832]: I1003 00:08:15.953490 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:15 crc kubenswrapper[4832]: I1003 00:08:15.953577 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:15 crc kubenswrapper[4832]: I1003 00:08:15.953603 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:15 crc kubenswrapper[4832]: I1003 00:08:15.953631 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:15 crc kubenswrapper[4832]: I1003 00:08:15.953650 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:15Z","lastTransitionTime":"2025-10-03T00:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:16 crc kubenswrapper[4832]: I1003 00:08:16.057337 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:16 crc kubenswrapper[4832]: I1003 00:08:16.057414 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:16 crc kubenswrapper[4832]: I1003 00:08:16.057429 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:16 crc kubenswrapper[4832]: I1003 00:08:16.057456 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:16 crc kubenswrapper[4832]: I1003 00:08:16.057474 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:16Z","lastTransitionTime":"2025-10-03T00:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:16 crc kubenswrapper[4832]: I1003 00:08:16.160826 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:16 crc kubenswrapper[4832]: I1003 00:08:16.160893 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:16 crc kubenswrapper[4832]: I1003 00:08:16.160911 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:16 crc kubenswrapper[4832]: I1003 00:08:16.160936 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:16 crc kubenswrapper[4832]: I1003 00:08:16.160956 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:16Z","lastTransitionTime":"2025-10-03T00:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:08:16 crc kubenswrapper[4832]: I1003 00:08:16.264011 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:16 crc kubenswrapper[4832]: I1003 00:08:16.264166 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:16 crc kubenswrapper[4832]: I1003 00:08:16.264191 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:16 crc kubenswrapper[4832]: I1003 00:08:16.264224 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:16 crc kubenswrapper[4832]: I1003 00:08:16.264247 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:16Z","lastTransitionTime":"2025-10-03T00:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:16 crc kubenswrapper[4832]: I1003 00:08:16.367830 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:16 crc kubenswrapper[4832]: I1003 00:08:16.367880 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:16 crc kubenswrapper[4832]: I1003 00:08:16.367890 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:16 crc kubenswrapper[4832]: I1003 00:08:16.367910 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:16 crc kubenswrapper[4832]: I1003 00:08:16.367920 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:16Z","lastTransitionTime":"2025-10-03T00:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:08:16 crc kubenswrapper[4832]: I1003 00:08:16.387704 4832 scope.go:117] "RemoveContainer" containerID="e1fbfcaa81a9cf32a47adcb39be9ce4e2d5f3b9cd8b8e8556cea49fd099ccef3" Oct 03 00:08:16 crc kubenswrapper[4832]: E1003 00:08:16.388196 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-zc9xl_openshift-ovn-kubernetes(0f189ec1-8169-4886-bff6-f67f7958050e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" podUID="0f189ec1-8169-4886-bff6-f67f7958050e" Oct 03 00:08:16 crc kubenswrapper[4832]: I1003 00:08:16.470451 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:16 crc kubenswrapper[4832]: I1003 00:08:16.470523 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:16 crc kubenswrapper[4832]: I1003 00:08:16.470542 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:16 crc kubenswrapper[4832]: I1003 00:08:16.470573 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:16 crc kubenswrapper[4832]: I1003 00:08:16.470596 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:16Z","lastTransitionTime":"2025-10-03T00:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:16 crc kubenswrapper[4832]: I1003 00:08:16.573872 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:16 crc kubenswrapper[4832]: I1003 00:08:16.573937 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:16 crc kubenswrapper[4832]: I1003 00:08:16.573956 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:16 crc kubenswrapper[4832]: I1003 00:08:16.573984 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:16 crc kubenswrapper[4832]: I1003 00:08:16.574005 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:16Z","lastTransitionTime":"2025-10-03T00:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:08:16 crc kubenswrapper[4832]: I1003 00:08:16.676547 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:16 crc kubenswrapper[4832]: I1003 00:08:16.676594 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:16 crc kubenswrapper[4832]: I1003 00:08:16.676616 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:16 crc kubenswrapper[4832]: I1003 00:08:16.676642 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:16 crc kubenswrapper[4832]: I1003 00:08:16.676661 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:16Z","lastTransitionTime":"2025-10-03T00:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:16 crc kubenswrapper[4832]: I1003 00:08:16.779535 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:16 crc kubenswrapper[4832]: I1003 00:08:16.779619 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:16 crc kubenswrapper[4832]: I1003 00:08:16.779637 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:16 crc kubenswrapper[4832]: I1003 00:08:16.779666 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:16 crc kubenswrapper[4832]: I1003 00:08:16.779686 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:16Z","lastTransitionTime":"2025-10-03T00:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:16 crc kubenswrapper[4832]: I1003 00:08:16.883217 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:16 crc kubenswrapper[4832]: I1003 00:08:16.883307 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:16 crc kubenswrapper[4832]: I1003 00:08:16.883336 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:16 crc kubenswrapper[4832]: I1003 00:08:16.883371 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:16 crc kubenswrapper[4832]: I1003 00:08:16.883400 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:16Z","lastTransitionTime":"2025-10-03T00:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:08:16 crc kubenswrapper[4832]: I1003 00:08:16.986878 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:16 crc kubenswrapper[4832]: I1003 00:08:16.986917 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:16 crc kubenswrapper[4832]: I1003 00:08:16.986927 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:16 crc kubenswrapper[4832]: I1003 00:08:16.986944 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:16 crc kubenswrapper[4832]: I1003 00:08:16.986955 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:16Z","lastTransitionTime":"2025-10-03T00:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:17 crc kubenswrapper[4832]: I1003 00:08:17.088947 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:17 crc kubenswrapper[4832]: I1003 00:08:17.088996 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:17 crc kubenswrapper[4832]: I1003 00:08:17.089006 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:17 crc kubenswrapper[4832]: I1003 00:08:17.089021 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:17 crc kubenswrapper[4832]: I1003 00:08:17.089035 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:17Z","lastTransitionTime":"2025-10-03T00:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:17 crc kubenswrapper[4832]: I1003 00:08:17.192468 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:17 crc kubenswrapper[4832]: I1003 00:08:17.192534 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:17 crc kubenswrapper[4832]: I1003 00:08:17.192556 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:17 crc kubenswrapper[4832]: I1003 00:08:17.192586 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:17 crc kubenswrapper[4832]: I1003 00:08:17.192608 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:17Z","lastTransitionTime":"2025-10-03T00:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:08:17 crc kubenswrapper[4832]: I1003 00:08:17.295372 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:17 crc kubenswrapper[4832]: I1003 00:08:17.295426 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:17 crc kubenswrapper[4832]: I1003 00:08:17.295441 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:17 crc kubenswrapper[4832]: I1003 00:08:17.295461 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:17 crc kubenswrapper[4832]: I1003 00:08:17.295477 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:17Z","lastTransitionTime":"2025-10-03T00:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:17 crc kubenswrapper[4832]: I1003 00:08:17.386117 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 00:08:17 crc kubenswrapper[4832]: I1003 00:08:17.386194 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ppthn" Oct 03 00:08:17 crc kubenswrapper[4832]: E1003 00:08:17.386253 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 00:08:17 crc kubenswrapper[4832]: I1003 00:08:17.386276 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 00:08:17 crc kubenswrapper[4832]: I1003 00:08:17.386136 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 00:08:17 crc kubenswrapper[4832]: E1003 00:08:17.386441 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ppthn" podUID="f3e796ab-f354-47c2-8f1b-1d29fe9ab059" Oct 03 00:08:17 crc kubenswrapper[4832]: E1003 00:08:17.386664 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 00:08:17 crc kubenswrapper[4832]: E1003 00:08:17.386797 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 00:08:17 crc kubenswrapper[4832]: I1003 00:08:17.398016 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:17 crc kubenswrapper[4832]: I1003 00:08:17.398052 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:17 crc kubenswrapper[4832]: I1003 00:08:17.398065 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:17 crc kubenswrapper[4832]: I1003 00:08:17.398080 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:17 crc kubenswrapper[4832]: I1003 00:08:17.398093 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:17Z","lastTransitionTime":"2025-10-03T00:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:17 crc kubenswrapper[4832]: I1003 00:08:17.405861 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78a6b04d90ea1fd4356f0a54594a83fd6029be216c6263654ad4916f2c3b7f5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:08:17Z is after 2025-08-24T17:21:41Z" Oct 03 00:08:17 crc kubenswrapper[4832]: I1003 00:08:17.417876 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-b9c9g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4369beec-ad3f-4de9-909b-7d0f8c76aad3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://898c5e76b7f0bbad018b2aa9db1e27240ccbd9d5b66854f548aa6d664abc8799\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvpm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-b9c9g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:08:17Z is after 2025-08-24T17:21:41Z" Oct 03 00:08:17 crc kubenswrapper[4832]: I1003 00:08:17.437371 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f189ec1-8169-4886-bff6-f67f7958050e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ba40dbb3c6b603f4c75b3f017b0b8953a189aac6dba4cc0e1fa5174b6353f3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2c63d70335d75dec7e19ae160dfa2a00f834d9aa3725bb4b63b94aa488fca0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3b296d00954bf341fee4ad258c6ee44adbd2950c89e73aaf5141201b61c5332\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7712a571bceb2c88adb90f9f18cdef6f7666e96341c1f9c87e6a55c82b59efd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6ee507d1127a1ca82e96f57a031cfef9adbd0b310588ce49a32f0cbcab7b3f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e493b0aab45e7a7842b48f0a13a6850b3bcd8a8f2147880ff278b92b04e9535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1fbfcaa81a9cf32a47adcb39be9ce4e2d5f3b9cd8b8e8556cea49fd099ccef3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1fbfcaa81a9cf32a47adcb39be9ce4e2d5f3b9cd8b8e8556cea49fd099ccef3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T00:08:03Z\\\",\\\"message\\\":\\\"image-registry/node-ca-76ck7 openshift-kube-scheduler/openshift-kube-scheduler-crc openshift-machine-config-operator/machine-config-daemon-xzpk4 openshift-multus/multus-additional-cni-plugins-zdfsz openshift-network-operator/iptables-alerter-4ln5h openshift-etcd/etcd-crc openshift-network-diagnostics/network-check-target-xd92c openshift-kube-apiserver/kube-apiserver-crc openshift-multus/multus-rwqt7 openshift-multus/network-metrics-daemon-ppthn openshift-network-diagnostics/network-check-source-55646444c4-trplf openshift-ovn-kubernetes/ovnkube-node-zc9xl]\\\\nI1003 00:08:03.233688 6867 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI1003 00:08:03.233703 6867 obj_retry.go:303] Retry object setup: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-zc9xl\\\\nF1003 00:08:03.233707 6867 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T00:08:02Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-zc9xl_openshift-ovn-kubernetes(0f189ec1-8169-4886-bff6-f67f7958050e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae1e89f85d166170fd478f58e653b46eaa1c8335c1b124af9fde5b805864197b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0669e6b46f57beece4c83940bcffe580a4b4e2db5451a9c9ca4a9ad486fc95b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0669e6b46f57beece4c83940bcffe580a4b4e2db5451a9c9ca4a9ad486fc95b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nlv87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zc9xl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:08:17Z is after 2025-08-24T17:21:41Z" Oct 03 00:08:17 crc kubenswrapper[4832]: I1003 00:08:17.451922 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fad7ffe6-f721-4c38-8acd-2d4de59b527b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da0287ba03b221e49a8840f93c53df8a865b9216a4e93f4a0fc4c62d79227004\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edaa07dca4d8aba0a6439a43af7397b5d9ef4d5a398f5c03ac93f3f41fb80ed9\\\",\\\"i
mage\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef77e4c71ea4d3dd7aa61afb0787bf88352730ae638cfefe1f9f9f033208692a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0636ac12ebbda7e4ee8bbe4f5be5454ca7a879c92c8f444466a34c387668398c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af8925d64d9d95117625cfca58e2f0c90128ce63c82370e3a505d48fce14e474\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"message\\\":\\\"file observer\\\\nW1003 00:07:06.003008 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 00:07:06.003232 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 00:07:06.007835 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-713505854/tls.crt::/tmp/serving-cert-713505854/tls.key\\\\\\\"\\\\nI1003 00:07:06.247728 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 00:07:06.253351 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 00:07:06.253377 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 00:07:06.253665 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 00:07:06.253677 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 00:07:06.261087 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 00:07:06.261596 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 00:07:06.261602 1 secure_serving.go:69] Use of insecure cipher 
'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 00:07:06.261606 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 00:07:06.261610 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 00:07:06.261613 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 00:07:06.261616 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 00:07:06.261905 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 00:07:06.264368 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc6fca5c4906594dc721c77dba62a306d840e88ed949a9a579a695bdc1f1c011\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://359b3f91c772c18a52ed1e5985275405b990187a860cf7a7f62c2f602f54c77e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://359b3f91c772c18a52ed1e5985275405b990187a860cf7a7f62c2f602f54c77e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:06:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:08:17Z is after 2025-08-24T17:21:41Z" Oct 03 00:08:17 crc kubenswrapper[4832]: I1003 00:08:17.465106 4832 status_manager.go:875] "Failed to update status 
for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a530701-412d-4b02-b532-de7993881992\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2b0553c8f774a096bc1446325436a1a73a24f67b3512462ce0b5029092f30c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0451961c80d1affd1cc6903d8c4ccba7de47c25255826d49ee312958f7a73b79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09e57b2ff533ebe739ef6e23dcd8d6f5ef85143046db77ed3a95ffb87f771152\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://703d4dc13279deda56e8154ab7238b507e78852fd
477db2bbe6e3e19aa0fbd37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:06:47Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:08:17Z is after 2025-08-24T17:21:41Z" Oct 03 00:08:17 crc kubenswrapper[4832]: I1003 00:08:17.476314 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"75a3f03f-3483-4611-9098-1bc8f776bf72\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2179d1da37c6a7dc24550fceca8e5f629ced4a0bfc34d8e0439be45b1a1128b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a5815e928eca00e1782c57d1b18e6fb2f7db287f404ea3c83323efe2db13752\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3c
a001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c798af55ddd694a06f5faa8f350941ad89970fa629efb67ad4f0ed08608192c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://990d28547daf6496b88f4cbe2716fce600053a9cbff1effd1a7bcf555de9f990\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://990d28547daf6496b88f4cbe2716fce600053a9cbff1effd1a7bcf555de9f990\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:06:47Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:08:17Z is after 2025-08-24T17:21:41Z" Oct 03 00:08:17 crc kubenswrapper[4832]: I1003 00:08:17.491001 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:08:17Z is after 2025-08-24T17:21:41Z" Oct 03 00:08:17 crc kubenswrapper[4832]: I1003 00:08:17.500836 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:17 crc kubenswrapper[4832]: I1003 00:08:17.500993 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:17 crc kubenswrapper[4832]: I1003 00:08:17.501020 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:17 crc kubenswrapper[4832]: I1003 00:08:17.501093 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:17 crc kubenswrapper[4832]: I1003 00:08:17.501125 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:17Z","lastTransitionTime":"2025-10-03T00:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:08:17 crc kubenswrapper[4832]: I1003 00:08:17.505302 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13eeb165-19f7-4e3f-a442-1c9c570d853b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://196d3468b2a4ea6792c19f85b28df1b5df5e612428d986d2affed70158ba92f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfnk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://786805325c320e5c08542c14a9ac9790f7c2dec237737fe1aca1523d1f2d26a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfnk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xzpk4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:08:17Z is after 2025-08-24T17:21:41Z" Oct 03 00:08:17 crc kubenswrapper[4832]: I1003 00:08:17.516514 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4j5bh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7262a85e-7e44-4422-b1d2-5d045503e06a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afd2b89ee54380c73b04bde347dce4d74a847e4ba94fc7aeabd5ac24e4dfd4b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vwhr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f03e95e621cb9e7d0c04b426440cf746eeccf68892215e1ed9e963935b9fdba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vwhr8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:
24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4j5bh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:08:17Z is after 2025-08-24T17:21:41Z" Oct 03 00:08:17 crc kubenswrapper[4832]: I1003 00:08:17.527550 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ppthn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3e796ab-f354-47c2-8f1b-1d29fe9ab059\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:25Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ppthn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:08:17Z is after 2025-08-24T17:21:41Z" Oct 03 00:08:17 crc kubenswrapper[4832]: I1003 00:08:17.545917 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31c3e0e9-c2a7-4c60-b523-9972df4adad5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1688f7425bdf5630147752e6d2c0dc58351b86d087c39bc6c4d73ffbf11949b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70f4161badca9a8087ab95b77ede5bc72d0c44b3f7f0223f2073354e76d6070d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5eb1fa15d84200378239e8a064cb83aa401750e92073f9d29d94858e65add776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\
\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://342b151ec083dc566428af857e4013b70d34fbf70632147d13f5e251ef3460c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f39a6c9881b2f1d620a85a3c6887d82b81d18ecf599c66b967d5580689537314\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a180babeb6469b2158474f74047864c74117a5c272be9437712bc1729563108b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a180babeb6469b2158474f74047864c74117a5c272be9437712bc1729563108b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b12fcbe54af7073178320f8a130e78196b2d1d3a055c4d8410e4cc1da2d4686e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b12fcbe5
4af7073178320f8a130e78196b2d1d3a055c4d8410e4cc1da2d4686e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:49Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f5e7853d0c8e96873c687c75d7226d8c11ed0ab5f4f2e1e975423de06b7c8c10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5e7853d0c8e96873c687c75d7226d8c11ed0ab5f4f2e1e975423de06b7c8c10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:06:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:08:17Z is after 2025-08-24T17:21:41Z" Oct 03 00:08:17 crc kubenswrapper[4832]: I1003 00:08:17.557298 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:08:17Z is after 2025-08-24T17:21:41Z" Oct 03 00:08:17 crc kubenswrapper[4832]: I1003 00:08:17.571864 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4175afbbdf7905f2c69f608e1cd85fdfae2818582d88e9a6d46c20ba0cede788\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:08:17Z is after 2025-08-24T17:21:41Z" Oct 03 00:08:17 crc kubenswrapper[4832]: I1003 00:08:17.582912 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6753f7eb0f3ff7e4c94125513be77026242079d658b7139e86701a2b04ba5d1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://823db4c8596fa85759a3d818b0282b8b8b2c4c2e948160accbd306ef09e38060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:08:17Z is after 2025-08-24T17:21:41Z" Oct 03 00:08:17 crc kubenswrapper[4832]: I1003 00:08:17.593723 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rwqt7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96bf1ee1-9249-4db9-83da-17eb15aa5f9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3584877a37d5f67dfaff9f091443ff884963e53e62bde6c8cefc55988d8c3a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://778c9afa5747e424ff72688b9d90dd21e3faec1f409c2b9683b5a16a37685369\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T00:07:58Z\\\",\\\"message\\\":\\\"2025-10-03T00:07:13+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_849ee2bf-aa0e-479e-b017-cbd23c73d446\\\\n2025-10-03T00:07:13+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_849ee2bf-aa0e-479e-b017-cbd23c73d446 to /host/opt/cni/bin/\\\\n2025-10-03T00:07:13Z [verbose] multus-daemon started\\\\n2025-10-03T00:07:13Z [verbose] Readiness Indicator file check\\\\n2025-10-03T00:07:58Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm92t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rwqt7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:08:17Z is after 2025-08-24T17:21:41Z" Oct 03 00:08:17 crc kubenswrapper[4832]: I1003 00:08:17.604143 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:17 crc kubenswrapper[4832]: I1003 00:08:17.604208 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:17 crc kubenswrapper[4832]: I1003 00:08:17.604218 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:17 crc kubenswrapper[4832]: I1003 00:08:17.604232 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:17 crc kubenswrapper[4832]: I1003 00:08:17.604241 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:17Z","lastTransitionTime":"2025-10-03T00:08:17Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:17 crc kubenswrapper[4832]: I1003 00:08:17.605184 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-76ck7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b962b7f-e442-4e25-97f4-21f16b9f1324\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e5df7a035e3c381c541f08eec6fffe846e7002a534b4c6ae1316a15cf74e469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ksm2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-76ck7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:08:17Z is after 2025-08-24T17:21:41Z" Oct 03 00:08:17 crc kubenswrapper[4832]: I1003 00:08:17.615375 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:08:17Z is after 2025-08-24T17:21:41Z" Oct 03 00:08:17 crc kubenswrapper[4832]: I1003 00:08:17.627371 4832 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zdfsz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"66a6f849-6870-4e28-8ca4-c849c22ad252\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T00:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53e3e718893a18a23dee11b28bb7285f26501527c6aa136acf2b8f7e0d7494c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f49f638397d6dffb41b5774d362b7441d3191f290149de6ecef5c38a6e8a7a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f49f638397d6dffb41b5774d362b7441d3191f290149de6ecef5c38a6e8a7a81\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d52a70ad1026d8eaaeed9dbb581eb90dbc962f036711e2f68085ac437c46b1fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d52a70ad1026d8eaaeed9dbb581eb90dbc962f036711e2f68085ac437c46b1fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ddd5a446bcc39d02bbaf8ae1b9a92b550b975c4280cea13a517076261e177bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ddd5a446bcc39d02bbaf8ae1b9a92b550b975c4280cea13a517076261e177bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f907c9873e2f8cbedebcc3116624dbaf2e6b8088d9f0ef52a5878e1515e73720\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f907c9873e2f8cbedebcc3116624dbaf2e6b8088d9f0ef52a5878e1515e73720\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d991977ea5d7140ddc09fecfc872067a05a7773c0066c94ec88c603b6f79f32e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d991977ea5d7140ddc09fecfc872067a05a7773c0066c94ec88c603b6f79f32e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7711e37ec666f0285f64eac11397ea7747d74997deaba1e9cc141218d28fbaa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7711e37ec666f0285f64eac11397ea7747d74997deaba1e9cc141218d28fbaa7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gf9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T00:07:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zdfsz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T00:08:17Z is after 2025-08-24T17:21:41Z" Oct 03 00:08:17 crc kubenswrapper[4832]: I1003 00:08:17.710942 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:17 crc kubenswrapper[4832]: I1003 00:08:17.710983 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:17 crc 
kubenswrapper[4832]: I1003 00:08:17.710993 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:17 crc kubenswrapper[4832]: I1003 00:08:17.711008 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:17 crc kubenswrapper[4832]: I1003 00:08:17.711017 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:17Z","lastTransitionTime":"2025-10-03T00:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:17 crc kubenswrapper[4832]: I1003 00:08:17.813030 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:17 crc kubenswrapper[4832]: I1003 00:08:17.813070 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:17 crc kubenswrapper[4832]: I1003 00:08:17.813083 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:17 crc kubenswrapper[4832]: I1003 00:08:17.813104 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:17 crc kubenswrapper[4832]: I1003 00:08:17.813119 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:17Z","lastTransitionTime":"2025-10-03T00:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:17 crc kubenswrapper[4832]: I1003 00:08:17.915559 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:17 crc kubenswrapper[4832]: I1003 00:08:17.915601 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:17 crc kubenswrapper[4832]: I1003 00:08:17.915614 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:17 crc kubenswrapper[4832]: I1003 00:08:17.915632 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:17 crc kubenswrapper[4832]: I1003 00:08:17.915645 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:17Z","lastTransitionTime":"2025-10-03T00:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:08:18 crc kubenswrapper[4832]: I1003 00:08:18.018791 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:18 crc kubenswrapper[4832]: I1003 00:08:18.018872 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:18 crc kubenswrapper[4832]: I1003 00:08:18.018892 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:18 crc kubenswrapper[4832]: I1003 00:08:18.018914 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:18 crc kubenswrapper[4832]: I1003 00:08:18.018931 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:18Z","lastTransitionTime":"2025-10-03T00:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:18 crc kubenswrapper[4832]: I1003 00:08:18.121743 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:18 crc kubenswrapper[4832]: I1003 00:08:18.121814 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:18 crc kubenswrapper[4832]: I1003 00:08:18.121826 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:18 crc kubenswrapper[4832]: I1003 00:08:18.121844 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:18 crc kubenswrapper[4832]: I1003 00:08:18.121859 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:18Z","lastTransitionTime":"2025-10-03T00:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:18 crc kubenswrapper[4832]: I1003 00:08:18.224453 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:18 crc kubenswrapper[4832]: I1003 00:08:18.224504 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:18 crc kubenswrapper[4832]: I1003 00:08:18.224516 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:18 crc kubenswrapper[4832]: I1003 00:08:18.224533 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:18 crc kubenswrapper[4832]: I1003 00:08:18.224546 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:18Z","lastTransitionTime":"2025-10-03T00:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:08:18 crc kubenswrapper[4832]: I1003 00:08:18.326798 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:18 crc kubenswrapper[4832]: I1003 00:08:18.326833 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:18 crc kubenswrapper[4832]: I1003 00:08:18.326844 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:18 crc kubenswrapper[4832]: I1003 00:08:18.326860 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:18 crc kubenswrapper[4832]: I1003 00:08:18.326871 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:18Z","lastTransitionTime":"2025-10-03T00:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:18 crc kubenswrapper[4832]: I1003 00:08:18.399901 4832 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Oct 03 00:08:18 crc kubenswrapper[4832]: I1003 00:08:18.430173 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:18 crc kubenswrapper[4832]: I1003 00:08:18.430232 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:18 crc kubenswrapper[4832]: I1003 00:08:18.430256 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:18 crc kubenswrapper[4832]: I1003 00:08:18.430284 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:18 crc kubenswrapper[4832]: I1003 00:08:18.430305 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:18Z","lastTransitionTime":"2025-10-03T00:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:08:18 crc kubenswrapper[4832]: I1003 00:08:18.533260 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:18 crc kubenswrapper[4832]: I1003 00:08:18.533323 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:18 crc kubenswrapper[4832]: I1003 00:08:18.533346 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:18 crc kubenswrapper[4832]: I1003 00:08:18.533374 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:18 crc kubenswrapper[4832]: I1003 00:08:18.533396 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:18Z","lastTransitionTime":"2025-10-03T00:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:18 crc kubenswrapper[4832]: I1003 00:08:18.635322 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:18 crc kubenswrapper[4832]: I1003 00:08:18.635382 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:18 crc kubenswrapper[4832]: I1003 00:08:18.635404 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:18 crc kubenswrapper[4832]: I1003 00:08:18.635431 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:18 crc kubenswrapper[4832]: I1003 00:08:18.635453 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:18Z","lastTransitionTime":"2025-10-03T00:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:18 crc kubenswrapper[4832]: I1003 00:08:18.737874 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:18 crc kubenswrapper[4832]: I1003 00:08:18.737924 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:18 crc kubenswrapper[4832]: I1003 00:08:18.737948 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:18 crc kubenswrapper[4832]: I1003 00:08:18.737967 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:18 crc kubenswrapper[4832]: I1003 00:08:18.737981 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:18Z","lastTransitionTime":"2025-10-03T00:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:08:18 crc kubenswrapper[4832]: I1003 00:08:18.840445 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:18 crc kubenswrapper[4832]: I1003 00:08:18.840479 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:18 crc kubenswrapper[4832]: I1003 00:08:18.840494 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:18 crc kubenswrapper[4832]: I1003 00:08:18.840513 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:18 crc kubenswrapper[4832]: I1003 00:08:18.840527 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:18Z","lastTransitionTime":"2025-10-03T00:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:18 crc kubenswrapper[4832]: I1003 00:08:18.943370 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:18 crc kubenswrapper[4832]: I1003 00:08:18.943425 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:18 crc kubenswrapper[4832]: I1003 00:08:18.943445 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:18 crc kubenswrapper[4832]: I1003 00:08:18.943467 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:18 crc kubenswrapper[4832]: I1003 00:08:18.943484 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:18Z","lastTransitionTime":"2025-10-03T00:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:19 crc kubenswrapper[4832]: I1003 00:08:19.045621 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:19 crc kubenswrapper[4832]: I1003 00:08:19.045670 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:19 crc kubenswrapper[4832]: I1003 00:08:19.045688 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:19 crc kubenswrapper[4832]: I1003 00:08:19.045711 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:19 crc kubenswrapper[4832]: I1003 00:08:19.045728 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:19Z","lastTransitionTime":"2025-10-03T00:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:08:19 crc kubenswrapper[4832]: I1003 00:08:19.148167 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:19 crc kubenswrapper[4832]: I1003 00:08:19.148202 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:19 crc kubenswrapper[4832]: I1003 00:08:19.148213 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:19 crc kubenswrapper[4832]: I1003 00:08:19.148227 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:19 crc kubenswrapper[4832]: I1003 00:08:19.148238 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:19Z","lastTransitionTime":"2025-10-03T00:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:19 crc kubenswrapper[4832]: I1003 00:08:19.250859 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:19 crc kubenswrapper[4832]: I1003 00:08:19.250940 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:19 crc kubenswrapper[4832]: I1003 00:08:19.250955 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:19 crc kubenswrapper[4832]: I1003 00:08:19.250972 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:19 crc kubenswrapper[4832]: I1003 00:08:19.250986 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:19Z","lastTransitionTime":"2025-10-03T00:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:19 crc kubenswrapper[4832]: I1003 00:08:19.353210 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:19 crc kubenswrapper[4832]: I1003 00:08:19.353252 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:19 crc kubenswrapper[4832]: I1003 00:08:19.353263 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:19 crc kubenswrapper[4832]: I1003 00:08:19.353281 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:19 crc kubenswrapper[4832]: I1003 00:08:19.353292 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:19Z","lastTransitionTime":"2025-10-03T00:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:08:19 crc kubenswrapper[4832]: I1003 00:08:19.385701 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 00:08:19 crc kubenswrapper[4832]: I1003 00:08:19.385895 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 00:08:19 crc kubenswrapper[4832]: I1003 00:08:19.385991 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ppthn" Oct 03 00:08:19 crc kubenswrapper[4832]: I1003 00:08:19.386083 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 00:08:19 crc kubenswrapper[4832]: E1003 00:08:19.386320 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 00:08:19 crc kubenswrapper[4832]: E1003 00:08:19.386411 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 00:08:19 crc kubenswrapper[4832]: E1003 00:08:19.386230 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ppthn" podUID="f3e796ab-f354-47c2-8f1b-1d29fe9ab059" Oct 03 00:08:19 crc kubenswrapper[4832]: E1003 00:08:19.386590 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 00:08:19 crc kubenswrapper[4832]: I1003 00:08:19.456064 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:19 crc kubenswrapper[4832]: I1003 00:08:19.456114 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:19 crc kubenswrapper[4832]: I1003 00:08:19.456124 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:19 crc kubenswrapper[4832]: I1003 00:08:19.456139 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:19 crc kubenswrapper[4832]: I1003 00:08:19.456150 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:19Z","lastTransitionTime":"2025-10-03T00:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:19 crc kubenswrapper[4832]: I1003 00:08:19.558923 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:19 crc kubenswrapper[4832]: I1003 00:08:19.558989 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:19 crc kubenswrapper[4832]: I1003 00:08:19.559005 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:19 crc kubenswrapper[4832]: I1003 00:08:19.559028 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:19 crc kubenswrapper[4832]: I1003 00:08:19.559045 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:19Z","lastTransitionTime":"2025-10-03T00:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:08:19 crc kubenswrapper[4832]: I1003 00:08:19.662114 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:19 crc kubenswrapper[4832]: I1003 00:08:19.662153 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:19 crc kubenswrapper[4832]: I1003 00:08:19.662163 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:19 crc kubenswrapper[4832]: I1003 00:08:19.662182 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:19 crc kubenswrapper[4832]: I1003 00:08:19.662192 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:19Z","lastTransitionTime":"2025-10-03T00:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:19 crc kubenswrapper[4832]: I1003 00:08:19.766388 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:19 crc kubenswrapper[4832]: I1003 00:08:19.766509 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:19 crc kubenswrapper[4832]: I1003 00:08:19.766538 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:19 crc kubenswrapper[4832]: I1003 00:08:19.766568 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:19 crc kubenswrapper[4832]: I1003 00:08:19.766588 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:19Z","lastTransitionTime":"2025-10-03T00:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:19 crc kubenswrapper[4832]: I1003 00:08:19.870960 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:19 crc kubenswrapper[4832]: I1003 00:08:19.871021 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:19 crc kubenswrapper[4832]: I1003 00:08:19.871041 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:19 crc kubenswrapper[4832]: I1003 00:08:19.871065 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:19 crc kubenswrapper[4832]: I1003 00:08:19.871083 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:19Z","lastTransitionTime":"2025-10-03T00:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:08:19 crc kubenswrapper[4832]: I1003 00:08:19.984456 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:19 crc kubenswrapper[4832]: I1003 00:08:19.984667 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:19 crc kubenswrapper[4832]: I1003 00:08:19.984686 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:19 crc kubenswrapper[4832]: I1003 00:08:19.984712 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:19 crc kubenswrapper[4832]: I1003 00:08:19.984723 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:19Z","lastTransitionTime":"2025-10-03T00:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:20 crc kubenswrapper[4832]: I1003 00:08:20.088627 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:20 crc kubenswrapper[4832]: I1003 00:08:20.088697 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:20 crc kubenswrapper[4832]: I1003 00:08:20.088713 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:20 crc kubenswrapper[4832]: I1003 00:08:20.088733 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:20 crc kubenswrapper[4832]: I1003 00:08:20.088770 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:20Z","lastTransitionTime":"2025-10-03T00:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:20 crc kubenswrapper[4832]: I1003 00:08:20.191779 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:20 crc kubenswrapper[4832]: I1003 00:08:20.191855 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:20 crc kubenswrapper[4832]: I1003 00:08:20.191870 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:20 crc kubenswrapper[4832]: I1003 00:08:20.191890 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:20 crc kubenswrapper[4832]: I1003 00:08:20.191904 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:20Z","lastTransitionTime":"2025-10-03T00:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:08:20 crc kubenswrapper[4832]: I1003 00:08:20.294155 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:20 crc kubenswrapper[4832]: I1003 00:08:20.294220 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:20 crc kubenswrapper[4832]: I1003 00:08:20.294230 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:20 crc kubenswrapper[4832]: I1003 00:08:20.294244 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:20 crc kubenswrapper[4832]: I1003 00:08:20.294254 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:20Z","lastTransitionTime":"2025-10-03T00:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:20 crc kubenswrapper[4832]: I1003 00:08:20.397372 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:20 crc kubenswrapper[4832]: I1003 00:08:20.397413 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:20 crc kubenswrapper[4832]: I1003 00:08:20.397424 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:20 crc kubenswrapper[4832]: I1003 00:08:20.397441 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:20 crc kubenswrapper[4832]: I1003 00:08:20.397451 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:20Z","lastTransitionTime":"2025-10-03T00:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:20 crc kubenswrapper[4832]: I1003 00:08:20.500373 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:20 crc kubenswrapper[4832]: I1003 00:08:20.500433 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:20 crc kubenswrapper[4832]: I1003 00:08:20.500454 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:20 crc kubenswrapper[4832]: I1003 00:08:20.500481 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:20 crc kubenswrapper[4832]: I1003 00:08:20.500503 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:20Z","lastTransitionTime":"2025-10-03T00:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:08:20 crc kubenswrapper[4832]: I1003 00:08:20.603360 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:20 crc kubenswrapper[4832]: I1003 00:08:20.603396 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:20 crc kubenswrapper[4832]: I1003 00:08:20.603408 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:20 crc kubenswrapper[4832]: I1003 00:08:20.603426 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:20 crc kubenswrapper[4832]: I1003 00:08:20.603439 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:20Z","lastTransitionTime":"2025-10-03T00:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:20 crc kubenswrapper[4832]: I1003 00:08:20.705313 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:20 crc kubenswrapper[4832]: I1003 00:08:20.705350 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:20 crc kubenswrapper[4832]: I1003 00:08:20.705363 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:20 crc kubenswrapper[4832]: I1003 00:08:20.705380 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:20 crc kubenswrapper[4832]: I1003 00:08:20.705391 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:20Z","lastTransitionTime":"2025-10-03T00:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:20 crc kubenswrapper[4832]: I1003 00:08:20.807587 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:20 crc kubenswrapper[4832]: I1003 00:08:20.807646 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:20 crc kubenswrapper[4832]: I1003 00:08:20.807669 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:20 crc kubenswrapper[4832]: I1003 00:08:20.807698 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:20 crc kubenswrapper[4832]: I1003 00:08:20.807720 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:20Z","lastTransitionTime":"2025-10-03T00:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:08:20 crc kubenswrapper[4832]: I1003 00:08:20.909975 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:20 crc kubenswrapper[4832]: I1003 00:08:20.910019 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:20 crc kubenswrapper[4832]: I1003 00:08:20.910031 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:20 crc kubenswrapper[4832]: I1003 00:08:20.910050 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:20 crc kubenswrapper[4832]: I1003 00:08:20.910067 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:20Z","lastTransitionTime":"2025-10-03T00:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:21 crc kubenswrapper[4832]: I1003 00:08:21.012374 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:21 crc kubenswrapper[4832]: I1003 00:08:21.012470 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:21 crc kubenswrapper[4832]: I1003 00:08:21.012483 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:21 crc kubenswrapper[4832]: I1003 00:08:21.012497 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:21 crc kubenswrapper[4832]: I1003 00:08:21.012507 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:21Z","lastTransitionTime":"2025-10-03T00:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:21 crc kubenswrapper[4832]: I1003 00:08:21.114960 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:21 crc kubenswrapper[4832]: I1003 00:08:21.115020 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:21 crc kubenswrapper[4832]: I1003 00:08:21.115037 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:21 crc kubenswrapper[4832]: I1003 00:08:21.115063 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:21 crc kubenswrapper[4832]: I1003 00:08:21.115082 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:21Z","lastTransitionTime":"2025-10-03T00:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:08:21 crc kubenswrapper[4832]: I1003 00:08:21.217180 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:21 crc kubenswrapper[4832]: I1003 00:08:21.217224 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:21 crc kubenswrapper[4832]: I1003 00:08:21.217236 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:21 crc kubenswrapper[4832]: I1003 00:08:21.217256 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:21 crc kubenswrapper[4832]: I1003 00:08:21.217269 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:21Z","lastTransitionTime":"2025-10-03T00:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:21 crc kubenswrapper[4832]: I1003 00:08:21.319950 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:21 crc kubenswrapper[4832]: I1003 00:08:21.319993 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:21 crc kubenswrapper[4832]: I1003 00:08:21.320009 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:21 crc kubenswrapper[4832]: I1003 00:08:21.320031 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:21 crc kubenswrapper[4832]: I1003 00:08:21.320045 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:21Z","lastTransitionTime":"2025-10-03T00:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:21 crc kubenswrapper[4832]: I1003 00:08:21.385567 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ppthn" Oct 03 00:08:21 crc kubenswrapper[4832]: I1003 00:08:21.385681 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 00:08:21 crc kubenswrapper[4832]: E1003 00:08:21.385802 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ppthn" podUID="f3e796ab-f354-47c2-8f1b-1d29fe9ab059" Oct 03 00:08:21 crc kubenswrapper[4832]: I1003 00:08:21.385851 4832 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 00:08:21 crc kubenswrapper[4832]: I1003 00:08:21.385946 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 00:08:21 crc kubenswrapper[4832]: E1003 00:08:21.386107 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 00:08:21 crc kubenswrapper[4832]: E1003 00:08:21.386289 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 00:08:21 crc kubenswrapper[4832]: E1003 00:08:21.386528 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 00:08:21 crc kubenswrapper[4832]: I1003 00:08:21.422536 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:21 crc kubenswrapper[4832]: I1003 00:08:21.422573 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:21 crc kubenswrapper[4832]: I1003 00:08:21.422584 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:21 crc kubenswrapper[4832]: I1003 00:08:21.422604 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:21 crc kubenswrapper[4832]: I1003 00:08:21.422621 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:21Z","lastTransitionTime":"2025-10-03T00:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:08:21 crc kubenswrapper[4832]: I1003 00:08:21.524988 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:21 crc kubenswrapper[4832]: I1003 00:08:21.525026 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:21 crc kubenswrapper[4832]: I1003 00:08:21.525035 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:21 crc kubenswrapper[4832]: I1003 00:08:21.525048 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:21 crc kubenswrapper[4832]: I1003 00:08:21.525057 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:21Z","lastTransitionTime":"2025-10-03T00:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:21 crc kubenswrapper[4832]: I1003 00:08:21.626720 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:21 crc kubenswrapper[4832]: I1003 00:08:21.626795 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:21 crc kubenswrapper[4832]: I1003 00:08:21.626804 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:21 crc kubenswrapper[4832]: I1003 00:08:21.626816 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:21 crc kubenswrapper[4832]: I1003 00:08:21.626824 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:21Z","lastTransitionTime":"2025-10-03T00:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:21 crc kubenswrapper[4832]: I1003 00:08:21.728824 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:21 crc kubenswrapper[4832]: I1003 00:08:21.728876 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:21 crc kubenswrapper[4832]: I1003 00:08:21.728891 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:21 crc kubenswrapper[4832]: I1003 00:08:21.728910 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:21 crc kubenswrapper[4832]: I1003 00:08:21.728921 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:21Z","lastTransitionTime":"2025-10-03T00:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:08:21 crc kubenswrapper[4832]: I1003 00:08:21.831142 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:21 crc kubenswrapper[4832]: I1003 00:08:21.831189 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:21 crc kubenswrapper[4832]: I1003 00:08:21.831201 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:21 crc kubenswrapper[4832]: I1003 00:08:21.831217 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:21 crc kubenswrapper[4832]: I1003 00:08:21.831230 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:21Z","lastTransitionTime":"2025-10-03T00:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:21 crc kubenswrapper[4832]: I1003 00:08:21.933781 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:21 crc kubenswrapper[4832]: I1003 00:08:21.933854 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:21 crc kubenswrapper[4832]: I1003 00:08:21.933877 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:21 crc kubenswrapper[4832]: I1003 00:08:21.933926 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:21 crc kubenswrapper[4832]: I1003 00:08:21.933944 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:21Z","lastTransitionTime":"2025-10-03T00:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:22 crc kubenswrapper[4832]: I1003 00:08:22.036334 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:22 crc kubenswrapper[4832]: I1003 00:08:22.036367 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:22 crc kubenswrapper[4832]: I1003 00:08:22.036378 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:22 crc kubenswrapper[4832]: I1003 00:08:22.036393 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:22 crc kubenswrapper[4832]: I1003 00:08:22.036402 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:22Z","lastTransitionTime":"2025-10-03T00:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:08:22 crc kubenswrapper[4832]: I1003 00:08:22.140561 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:22 crc kubenswrapper[4832]: I1003 00:08:22.140627 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:22 crc kubenswrapper[4832]: I1003 00:08:22.140640 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:22 crc kubenswrapper[4832]: I1003 00:08:22.140658 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:22 crc kubenswrapper[4832]: I1003 00:08:22.140672 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:22Z","lastTransitionTime":"2025-10-03T00:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:22 crc kubenswrapper[4832]: I1003 00:08:22.243141 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:22 crc kubenswrapper[4832]: I1003 00:08:22.243178 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:22 crc kubenswrapper[4832]: I1003 00:08:22.243186 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:22 crc kubenswrapper[4832]: I1003 00:08:22.243200 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:22 crc kubenswrapper[4832]: I1003 00:08:22.243208 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:22Z","lastTransitionTime":"2025-10-03T00:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:22 crc kubenswrapper[4832]: I1003 00:08:22.345976 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:22 crc kubenswrapper[4832]: I1003 00:08:22.346018 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:22 crc kubenswrapper[4832]: I1003 00:08:22.346027 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:22 crc kubenswrapper[4832]: I1003 00:08:22.346040 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:22 crc kubenswrapper[4832]: I1003 00:08:22.346050 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:22Z","lastTransitionTime":"2025-10-03T00:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:08:22 crc kubenswrapper[4832]: I1003 00:08:22.449040 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:22 crc kubenswrapper[4832]: I1003 00:08:22.449090 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:22 crc kubenswrapper[4832]: I1003 00:08:22.449103 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:22 crc kubenswrapper[4832]: I1003 00:08:22.449128 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:22 crc kubenswrapper[4832]: I1003 00:08:22.449142 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:22Z","lastTransitionTime":"2025-10-03T00:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:22 crc kubenswrapper[4832]: I1003 00:08:22.551534 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:22 crc kubenswrapper[4832]: I1003 00:08:22.551589 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:22 crc kubenswrapper[4832]: I1003 00:08:22.551617 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:22 crc kubenswrapper[4832]: I1003 00:08:22.551639 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:22 crc kubenswrapper[4832]: I1003 00:08:22.551653 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:22Z","lastTransitionTime":"2025-10-03T00:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:22 crc kubenswrapper[4832]: I1003 00:08:22.653703 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:22 crc kubenswrapper[4832]: I1003 00:08:22.653746 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:22 crc kubenswrapper[4832]: I1003 00:08:22.653780 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:22 crc kubenswrapper[4832]: I1003 00:08:22.653796 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:22 crc kubenswrapper[4832]: I1003 00:08:22.653810 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:22Z","lastTransitionTime":"2025-10-03T00:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:08:22 crc kubenswrapper[4832]: I1003 00:08:22.755918 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:22 crc kubenswrapper[4832]: I1003 00:08:22.755965 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:22 crc kubenswrapper[4832]: I1003 00:08:22.755977 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:22 crc kubenswrapper[4832]: I1003 00:08:22.755992 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:22 crc kubenswrapper[4832]: I1003 00:08:22.756003 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:22Z","lastTransitionTime":"2025-10-03T00:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:22 crc kubenswrapper[4832]: I1003 00:08:22.859393 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:22 crc kubenswrapper[4832]: I1003 00:08:22.859444 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:22 crc kubenswrapper[4832]: I1003 00:08:22.859453 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:22 crc kubenswrapper[4832]: I1003 00:08:22.859469 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:22 crc kubenswrapper[4832]: I1003 00:08:22.859484 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:22Z","lastTransitionTime":"2025-10-03T00:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:22 crc kubenswrapper[4832]: I1003 00:08:22.962453 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:22 crc kubenswrapper[4832]: I1003 00:08:22.962491 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:22 crc kubenswrapper[4832]: I1003 00:08:22.962500 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:22 crc kubenswrapper[4832]: I1003 00:08:22.962514 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:22 crc kubenswrapper[4832]: I1003 00:08:22.962522 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:22Z","lastTransitionTime":"2025-10-03T00:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:08:23 crc kubenswrapper[4832]: I1003 00:08:23.065312 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:23 crc kubenswrapper[4832]: I1003 00:08:23.065353 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:23 crc kubenswrapper[4832]: I1003 00:08:23.065361 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:23 crc kubenswrapper[4832]: I1003 00:08:23.065376 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:23 crc kubenswrapper[4832]: I1003 00:08:23.065385 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:23Z","lastTransitionTime":"2025-10-03T00:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:23 crc kubenswrapper[4832]: I1003 00:08:23.167395 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:23 crc kubenswrapper[4832]: I1003 00:08:23.167424 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:23 crc kubenswrapper[4832]: I1003 00:08:23.167432 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:23 crc kubenswrapper[4832]: I1003 00:08:23.167445 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:23 crc kubenswrapper[4832]: I1003 00:08:23.167454 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:23Z","lastTransitionTime":"2025-10-03T00:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:23 crc kubenswrapper[4832]: I1003 00:08:23.270158 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:23 crc kubenswrapper[4832]: I1003 00:08:23.270188 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:23 crc kubenswrapper[4832]: I1003 00:08:23.270196 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:23 crc kubenswrapper[4832]: I1003 00:08:23.270208 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:23 crc kubenswrapper[4832]: I1003 00:08:23.270216 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:23Z","lastTransitionTime":"2025-10-03T00:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:08:23 crc kubenswrapper[4832]: I1003 00:08:23.373163 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:23 crc kubenswrapper[4832]: I1003 00:08:23.373224 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:23 crc kubenswrapper[4832]: I1003 00:08:23.373242 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:23 crc kubenswrapper[4832]: I1003 00:08:23.373265 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:23 crc kubenswrapper[4832]: I1003 00:08:23.373281 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:23Z","lastTransitionTime":"2025-10-03T00:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:23 crc kubenswrapper[4832]: I1003 00:08:23.385682 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ppthn" Oct 03 00:08:23 crc kubenswrapper[4832]: I1003 00:08:23.385745 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 00:08:23 crc kubenswrapper[4832]: E1003 00:08:23.385848 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ppthn" podUID="f3e796ab-f354-47c2-8f1b-1d29fe9ab059" Oct 03 00:08:23 crc kubenswrapper[4832]: I1003 00:08:23.385882 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 00:08:23 crc kubenswrapper[4832]: E1003 00:08:23.386014 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 00:08:23 crc kubenswrapper[4832]: E1003 00:08:23.386128 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 00:08:23 crc kubenswrapper[4832]: I1003 00:08:23.386191 4832 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 00:08:23 crc kubenswrapper[4832]: E1003 00:08:23.386287 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 00:08:23 crc kubenswrapper[4832]: I1003 00:08:23.475787 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:23 crc kubenswrapper[4832]: I1003 00:08:23.475823 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:23 crc kubenswrapper[4832]: I1003 00:08:23.475834 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:23 crc kubenswrapper[4832]: I1003 00:08:23.475848 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:23 crc kubenswrapper[4832]: I1003 00:08:23.475857 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:23Z","lastTransitionTime":"2025-10-03T00:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:23 crc kubenswrapper[4832]: I1003 00:08:23.578397 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:23 crc kubenswrapper[4832]: I1003 00:08:23.578426 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:23 crc kubenswrapper[4832]: I1003 00:08:23.578434 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:23 crc kubenswrapper[4832]: I1003 00:08:23.578448 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:23 crc kubenswrapper[4832]: I1003 00:08:23.578456 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:23Z","lastTransitionTime":"2025-10-03T00:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:08:23 crc kubenswrapper[4832]: I1003 00:08:23.680680 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:23 crc kubenswrapper[4832]: I1003 00:08:23.680725 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:23 crc kubenswrapper[4832]: I1003 00:08:23.680740 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:23 crc kubenswrapper[4832]: I1003 00:08:23.680789 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:23 crc kubenswrapper[4832]: I1003 00:08:23.680810 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:23Z","lastTransitionTime":"2025-10-03T00:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:23 crc kubenswrapper[4832]: I1003 00:08:23.784789 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:23 crc kubenswrapper[4832]: I1003 00:08:23.784825 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:23 crc kubenswrapper[4832]: I1003 00:08:23.784837 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:23 crc kubenswrapper[4832]: I1003 00:08:23.784852 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:23 crc kubenswrapper[4832]: I1003 00:08:23.784863 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:23Z","lastTransitionTime":"2025-10-03T00:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:23 crc kubenswrapper[4832]: I1003 00:08:23.887571 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:23 crc kubenswrapper[4832]: I1003 00:08:23.887622 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:23 crc kubenswrapper[4832]: I1003 00:08:23.887639 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:23 crc kubenswrapper[4832]: I1003 00:08:23.887659 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:23 crc kubenswrapper[4832]: I1003 00:08:23.887678 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:23Z","lastTransitionTime":"2025-10-03T00:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:08:23 crc kubenswrapper[4832]: I1003 00:08:23.989894 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:23 crc kubenswrapper[4832]: I1003 00:08:23.989939 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:23 crc kubenswrapper[4832]: I1003 00:08:23.989948 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:23 crc kubenswrapper[4832]: I1003 00:08:23.989964 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:23 crc kubenswrapper[4832]: I1003 00:08:23.989974 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:23Z","lastTransitionTime":"2025-10-03T00:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:24 crc kubenswrapper[4832]: I1003 00:08:24.093035 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:24 crc kubenswrapper[4832]: I1003 00:08:24.093112 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:24 crc kubenswrapper[4832]: I1003 00:08:24.093131 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:24 crc kubenswrapper[4832]: I1003 00:08:24.093160 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:24 crc kubenswrapper[4832]: I1003 00:08:24.093178 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:24Z","lastTransitionTime":"2025-10-03T00:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:24 crc kubenswrapper[4832]: I1003 00:08:24.196183 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:24 crc kubenswrapper[4832]: I1003 00:08:24.196257 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:24 crc kubenswrapper[4832]: I1003 00:08:24.196281 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:24 crc kubenswrapper[4832]: I1003 00:08:24.196305 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:24 crc kubenswrapper[4832]: I1003 00:08:24.196323 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:24Z","lastTransitionTime":"2025-10-03T00:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:08:24 crc kubenswrapper[4832]: I1003 00:08:24.299140 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:24 crc kubenswrapper[4832]: I1003 00:08:24.299195 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:24 crc kubenswrapper[4832]: I1003 00:08:24.299206 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:24 crc kubenswrapper[4832]: I1003 00:08:24.299224 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:24 crc kubenswrapper[4832]: I1003 00:08:24.299238 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:24Z","lastTransitionTime":"2025-10-03T00:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:24 crc kubenswrapper[4832]: I1003 00:08:24.401629 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:24 crc kubenswrapper[4832]: I1003 00:08:24.401674 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:24 crc kubenswrapper[4832]: I1003 00:08:24.401685 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:24 crc kubenswrapper[4832]: I1003 00:08:24.401706 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:24 crc kubenswrapper[4832]: I1003 00:08:24.401718 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:24Z","lastTransitionTime":"2025-10-03T00:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:24 crc kubenswrapper[4832]: I1003 00:08:24.504876 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:24 crc kubenswrapper[4832]: I1003 00:08:24.504940 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:24 crc kubenswrapper[4832]: I1003 00:08:24.504956 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:24 crc kubenswrapper[4832]: I1003 00:08:24.504977 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:24 crc kubenswrapper[4832]: I1003 00:08:24.504995 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:24Z","lastTransitionTime":"2025-10-03T00:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 00:08:24 crc kubenswrapper[4832]: I1003 00:08:24.608321 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:24 crc kubenswrapper[4832]: I1003 00:08:24.608387 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:24 crc kubenswrapper[4832]: I1003 00:08:24.608403 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:24 crc kubenswrapper[4832]: I1003 00:08:24.608424 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:24 crc kubenswrapper[4832]: I1003 00:08:24.608441 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:24Z","lastTransitionTime":"2025-10-03T00:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:24 crc kubenswrapper[4832]: I1003 00:08:24.681918 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 00:08:24 crc kubenswrapper[4832]: I1003 00:08:24.681961 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 00:08:24 crc kubenswrapper[4832]: I1003 00:08:24.681975 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 00:08:24 crc kubenswrapper[4832]: I1003 00:08:24.682023 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 00:08:24 crc kubenswrapper[4832]: I1003 00:08:24.682036 4832 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T00:08:24Z","lastTransitionTime":"2025-10-03T00:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 00:08:24 crc kubenswrapper[4832]: I1003 00:08:24.725547 4832 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-dwsk9"] Oct 03 00:08:24 crc kubenswrapper[4832]: I1003 00:08:24.726024 4832 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dwsk9" Oct 03 00:08:24 crc kubenswrapper[4832]: I1003 00:08:24.730326 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Oct 03 00:08:24 crc kubenswrapper[4832]: I1003 00:08:24.730492 4832 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Oct 03 00:08:24 crc kubenswrapper[4832]: I1003 00:08:24.730586 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Oct 03 00:08:24 crc kubenswrapper[4832]: I1003 00:08:24.730820 4832 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Oct 03 00:08:24 crc kubenswrapper[4832]: I1003 00:08:24.756271 4832 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-b9c9g" podStartSLOduration=74.756253766 podStartE2EDuration="1m14.756253766s" podCreationTimestamp="2025-10-03 00:07:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 00:08:24.755988637 +0000 UTC m=+98.018143763" watchObservedRunningTime="2025-10-03 00:08:24.756253766 +0000 UTC m=+98.018408872" Oct 03 00:08:24 crc kubenswrapper[4832]: I1003 00:08:24.803844 4832 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" podStartSLOduration=74.803829924 podStartE2EDuration="1m14.803829924s" podCreationTimestamp="2025-10-03 00:07:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 00:08:24.803488782 +0000 UTC m=+98.065643888" watchObservedRunningTime="2025-10-03 00:08:24.803829924 +0000 UTC m=+98.065985030" Oct 03 00:08:24 crc kubenswrapper[4832]: I1003 00:08:24.835680 4832 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=6.835662673 podStartE2EDuration="6.835662673s" podCreationTimestamp="2025-10-03 00:08:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 00:08:24.819340154 +0000 UTC m=+98.081495260" watchObservedRunningTime="2025-10-03 00:08:24.835662673 +0000 UTC m=+98.097817779" Oct 03 00:08:24 crc kubenswrapper[4832]: I1003 00:08:24.845975 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/93157f45-a24d-4f7f-9ae2-efacd7091101-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-dwsk9\" (UID: \"93157f45-a24d-4f7f-9ae2-efacd7091101\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dwsk9" Oct 03 00:08:24 crc kubenswrapper[4832]: I1003 00:08:24.846039 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/93157f45-a24d-4f7f-9ae2-efacd7091101-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-dwsk9\" (UID: \"93157f45-a24d-4f7f-9ae2-efacd7091101\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dwsk9" Oct 03 00:08:24 crc kubenswrapper[4832]: I1003 00:08:24.846055 
4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/93157f45-a24d-4f7f-9ae2-efacd7091101-service-ca\") pod \"cluster-version-operator-5c965bbfc6-dwsk9\" (UID: \"93157f45-a24d-4f7f-9ae2-efacd7091101\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dwsk9" Oct 03 00:08:24 crc kubenswrapper[4832]: I1003 00:08:24.846100 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/93157f45-a24d-4f7f-9ae2-efacd7091101-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-dwsk9\" (UID: \"93157f45-a24d-4f7f-9ae2-efacd7091101\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dwsk9" Oct 03 00:08:24 crc kubenswrapper[4832]: I1003 00:08:24.846130 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/93157f45-a24d-4f7f-9ae2-efacd7091101-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-dwsk9\" (UID: \"93157f45-a24d-4f7f-9ae2-efacd7091101\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dwsk9" Oct 03 00:08:24 crc kubenswrapper[4832]: I1003 00:08:24.852572 4832 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=78.852560771 podStartE2EDuration="1m18.852560771s" podCreationTimestamp="2025-10-03 00:07:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 00:08:24.836445519 +0000 UTC m=+98.098600625" watchObservedRunningTime="2025-10-03 00:08:24.852560771 +0000 UTC m=+98.114715877" Oct 03 00:08:24 crc kubenswrapper[4832]: I1003 00:08:24.865118 4832 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=74.865099442 podStartE2EDuration="1m14.865099442s" podCreationTimestamp="2025-10-03 00:07:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 00:08:24.853095328 +0000 UTC m=+98.115250434" watchObservedRunningTime="2025-10-03 00:08:24.865099442 +0000 UTC m=+98.127254548" Oct 03 00:08:24 crc kubenswrapper[4832]: I1003 00:08:24.880931 4832 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-rwqt7" podStartSLOduration=74.880916332 podStartE2EDuration="1m14.880916332s" podCreationTimestamp="2025-10-03 00:07:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 00:08:24.87994526 +0000 UTC m=+98.142100366" watchObservedRunningTime="2025-10-03 00:08:24.880916332 +0000 UTC m=+98.143071438" Oct 03 00:08:24 crc kubenswrapper[4832]: I1003 00:08:24.881211 4832 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=41.881206782 podStartE2EDuration="41.881206782s" podCreationTimestamp="2025-10-03 00:07:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 00:08:24.865716583 +0000 UTC m=+98.127871689" 
watchObservedRunningTime="2025-10-03 00:08:24.881206782 +0000 UTC m=+98.143361888" Oct 03 00:08:24 crc kubenswrapper[4832]: I1003 00:08:24.903118 4832 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-76ck7" podStartSLOduration=74.903099577 podStartE2EDuration="1m14.903099577s" podCreationTimestamp="2025-10-03 00:07:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 00:08:24.892862844 +0000 UTC m=+98.155017950" watchObservedRunningTime="2025-10-03 00:08:24.903099577 +0000 UTC m=+98.165254683" Oct 03 00:08:24 crc kubenswrapper[4832]: I1003 00:08:24.903224 4832 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4j5bh" podStartSLOduration=73.903220031 podStartE2EDuration="1m13.903220031s" podCreationTimestamp="2025-10-03 00:07:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 00:08:24.903203951 +0000 UTC m=+98.165359057" watchObservedRunningTime="2025-10-03 00:08:24.903220031 +0000 UTC m=+98.165375147" Oct 03 00:08:24 crc kubenswrapper[4832]: I1003 00:08:24.945275 4832 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=74.945258273 podStartE2EDuration="1m14.945258273s" podCreationTimestamp="2025-10-03 00:07:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 00:08:24.944458597 +0000 UTC m=+98.206613703" watchObservedRunningTime="2025-10-03 00:08:24.945258273 +0000 UTC m=+98.207413389" Oct 03 00:08:24 crc kubenswrapper[4832]: I1003 00:08:24.946529 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/93157f45-a24d-4f7f-9ae2-efacd7091101-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-dwsk9\" (UID: \"93157f45-a24d-4f7f-9ae2-efacd7091101\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dwsk9" Oct 03 00:08:24 crc kubenswrapper[4832]: I1003 00:08:24.946575 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/93157f45-a24d-4f7f-9ae2-efacd7091101-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-dwsk9\" (UID: \"93157f45-a24d-4f7f-9ae2-efacd7091101\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dwsk9" Oct 03 00:08:24 crc kubenswrapper[4832]: I1003 00:08:24.946597 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/93157f45-a24d-4f7f-9ae2-efacd7091101-service-ca\") pod \"cluster-version-operator-5c965bbfc6-dwsk9\" (UID: \"93157f45-a24d-4f7f-9ae2-efacd7091101\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dwsk9" Oct 03 00:08:24 crc kubenswrapper[4832]: I1003 00:08:24.946623 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/93157f45-a24d-4f7f-9ae2-efacd7091101-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-dwsk9\" (UID: \"93157f45-a24d-4f7f-9ae2-efacd7091101\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dwsk9" Oct 03 00:08:24 crc 
kubenswrapper[4832]: I1003 00:08:24.946670 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/93157f45-a24d-4f7f-9ae2-efacd7091101-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-dwsk9\" (UID: \"93157f45-a24d-4f7f-9ae2-efacd7091101\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dwsk9" Oct 03 00:08:24 crc kubenswrapper[4832]: I1003 00:08:24.946767 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/93157f45-a24d-4f7f-9ae2-efacd7091101-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-dwsk9\" (UID: \"93157f45-a24d-4f7f-9ae2-efacd7091101\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dwsk9" Oct 03 00:08:24 crc kubenswrapper[4832]: I1003 00:08:24.947908 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/93157f45-a24d-4f7f-9ae2-efacd7091101-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-dwsk9\" (UID: \"93157f45-a24d-4f7f-9ae2-efacd7091101\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dwsk9" Oct 03 00:08:24 crc kubenswrapper[4832]: I1003 00:08:24.948270 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/93157f45-a24d-4f7f-9ae2-efacd7091101-service-ca\") pod \"cluster-version-operator-5c965bbfc6-dwsk9\" (UID: \"93157f45-a24d-4f7f-9ae2-efacd7091101\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dwsk9" Oct 03 00:08:24 crc kubenswrapper[4832]: I1003 00:08:24.956728 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/93157f45-a24d-4f7f-9ae2-efacd7091101-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-dwsk9\" (UID: \"93157f45-a24d-4f7f-9ae2-efacd7091101\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dwsk9" Oct 03 00:08:24 crc kubenswrapper[4832]: I1003 00:08:24.962799 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/93157f45-a24d-4f7f-9ae2-efacd7091101-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-dwsk9\" (UID: \"93157f45-a24d-4f7f-9ae2-efacd7091101\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dwsk9" Oct 03 00:08:25 crc kubenswrapper[4832]: I1003 00:08:25.048251 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dwsk9" Oct 03 00:08:25 crc kubenswrapper[4832]: I1003 00:08:25.385944 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 00:08:25 crc kubenswrapper[4832]: I1003 00:08:25.386087 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 00:08:25 crc kubenswrapper[4832]: E1003 00:08:25.386344 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 00:08:25 crc kubenswrapper[4832]: I1003 00:08:25.386118 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ppthn" Oct 03 00:08:25 crc kubenswrapper[4832]: E1003 00:08:25.386420 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ppthn" podUID="f3e796ab-f354-47c2-8f1b-1d29fe9ab059" Oct 03 00:08:25 crc kubenswrapper[4832]: I1003 00:08:25.386140 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 00:08:25 crc kubenswrapper[4832]: E1003 00:08:25.386247 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 00:08:25 crc kubenswrapper[4832]: E1003 00:08:25.386467 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 00:08:25 crc kubenswrapper[4832]: I1003 00:08:25.915177 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dwsk9" event={"ID":"93157f45-a24d-4f7f-9ae2-efacd7091101","Type":"ContainerStarted","Data":"9a294b60c389730945a3476adeeb0b056e8dbea32c0d7567c3355ae1e72543b2"} Oct 03 00:08:25 crc kubenswrapper[4832]: I1003 00:08:25.915236 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dwsk9" event={"ID":"93157f45-a24d-4f7f-9ae2-efacd7091101","Type":"ContainerStarted","Data":"8e0218a3d36e47ca3950c3823ea4fbaea2e36cabdecae6bcace81a76953b4628"} Oct 03 00:08:25 crc kubenswrapper[4832]: I1003 00:08:25.929376 4832 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dwsk9" podStartSLOduration=75.929359523 podStartE2EDuration="1m15.929359523s" podCreationTimestamp="2025-10-03 00:07:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 00:08:25.928686 +0000 UTC m=+99.190841116" watchObservedRunningTime="2025-10-03 00:08:25.929359523 +0000 UTC m=+99.191514639" Oct 03 00:08:25 crc kubenswrapper[4832]: I1003 00:08:25.929684 4832 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-zdfsz" podStartSLOduration=75.929679754 podStartE2EDuration="1m15.929679754s" podCreationTimestamp="2025-10-03 00:07:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 00:08:25.076633355 +0000 UTC m=+98.338788461" watchObservedRunningTime="2025-10-03 00:08:25.929679754 +0000 UTC m=+99.191834870" Oct 03 00:08:27 crc kubenswrapper[4832]: I1003 00:08:27.386473 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 00:08:27 crc kubenswrapper[4832]: I1003 00:08:27.386566 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 00:08:27 crc kubenswrapper[4832]: I1003 00:08:27.386623 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 00:08:27 crc kubenswrapper[4832]: E1003 00:08:27.387870 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 00:08:27 crc kubenswrapper[4832]: I1003 00:08:27.387947 4832 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-ppthn" Oct 03 00:08:27 crc kubenswrapper[4832]: E1003 00:08:27.388035 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 00:08:27 crc kubenswrapper[4832]: E1003 00:08:27.388318 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ppthn" podUID="f3e796ab-f354-47c2-8f1b-1d29fe9ab059" Oct 03 00:08:27 crc kubenswrapper[4832]: E1003 00:08:27.388453 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 00:08:29 crc kubenswrapper[4832]: I1003 00:08:29.389725 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ppthn" Oct 03 00:08:29 crc kubenswrapper[4832]: E1003 00:08:29.389986 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ppthn" podUID="f3e796ab-f354-47c2-8f1b-1d29fe9ab059" Oct 03 00:08:29 crc kubenswrapper[4832]: I1003 00:08:29.390294 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 00:08:29 crc kubenswrapper[4832]: E1003 00:08:29.390381 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 00:08:29 crc kubenswrapper[4832]: I1003 00:08:29.390566 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 00:08:29 crc kubenswrapper[4832]: E1003 00:08:29.390654 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 00:08:29 crc kubenswrapper[4832]: I1003 00:08:29.390861 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 00:08:29 crc kubenswrapper[4832]: E1003 00:08:29.391059 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 00:08:29 crc kubenswrapper[4832]: I1003 00:08:29.602234 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f3e796ab-f354-47c2-8f1b-1d29fe9ab059-metrics-certs\") pod \"network-metrics-daemon-ppthn\" (UID: \"f3e796ab-f354-47c2-8f1b-1d29fe9ab059\") " pod="openshift-multus/network-metrics-daemon-ppthn" Oct 03 00:08:29 crc kubenswrapper[4832]: E1003 00:08:29.602487 4832 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 00:08:29 crc kubenswrapper[4832]: E1003 00:08:29.602639 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f3e796ab-f354-47c2-8f1b-1d29fe9ab059-metrics-certs podName:f3e796ab-f354-47c2-8f1b-1d29fe9ab059 nodeName:}" failed. No retries permitted until 2025-10-03 00:09:33.602605003 +0000 UTC m=+166.864760289 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f3e796ab-f354-47c2-8f1b-1d29fe9ab059-metrics-certs") pod "network-metrics-daemon-ppthn" (UID: "f3e796ab-f354-47c2-8f1b-1d29fe9ab059") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 00:08:30 crc kubenswrapper[4832]: I1003 00:08:30.386822 4832 scope.go:117] "RemoveContainer" containerID="e1fbfcaa81a9cf32a47adcb39be9ce4e2d5f3b9cd8b8e8556cea49fd099ccef3" Oct 03 00:08:30 crc kubenswrapper[4832]: E1003 00:08:30.387913 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-zc9xl_openshift-ovn-kubernetes(0f189ec1-8169-4886-bff6-f67f7958050e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" podUID="0f189ec1-8169-4886-bff6-f67f7958050e" Oct 03 00:08:31 crc kubenswrapper[4832]: I1003 00:08:31.387130 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 00:08:31 crc kubenswrapper[4832]: I1003 00:08:31.387300 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ppthn" Oct 03 00:08:31 crc kubenswrapper[4832]: I1003 00:08:31.387423 4832 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 00:08:31 crc kubenswrapper[4832]: E1003 00:08:31.387412 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 00:08:31 crc kubenswrapper[4832]: E1003 00:08:31.387629 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ppthn" podUID="f3e796ab-f354-47c2-8f1b-1d29fe9ab059" Oct 03 00:08:31 crc kubenswrapper[4832]: I1003 00:08:31.387730 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 00:08:31 crc kubenswrapper[4832]: E1003 00:08:31.387866 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 00:08:31 crc kubenswrapper[4832]: E1003 00:08:31.387968 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 00:08:33 crc kubenswrapper[4832]: I1003 00:08:33.386452 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ppthn" Oct 03 00:08:33 crc kubenswrapper[4832]: I1003 00:08:33.386516 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 00:08:33 crc kubenswrapper[4832]: I1003 00:08:33.386495 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 00:08:33 crc kubenswrapper[4832]: I1003 00:08:33.386488 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 00:08:33 crc kubenswrapper[4832]: E1003 00:08:33.386807 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ppthn" podUID="f3e796ab-f354-47c2-8f1b-1d29fe9ab059" Oct 03 00:08:33 crc kubenswrapper[4832]: E1003 00:08:33.386907 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 00:08:33 crc kubenswrapper[4832]: E1003 00:08:33.387042 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 00:08:33 crc kubenswrapper[4832]: E1003 00:08:33.387127 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 00:08:35 crc kubenswrapper[4832]: I1003 00:08:35.385929 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ppthn" Oct 03 00:08:35 crc kubenswrapper[4832]: I1003 00:08:35.385986 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 00:08:35 crc kubenswrapper[4832]: I1003 00:08:35.386185 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 00:08:35 crc kubenswrapper[4832]: E1003 00:08:35.386328 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ppthn" podUID="f3e796ab-f354-47c2-8f1b-1d29fe9ab059" Oct 03 00:08:35 crc kubenswrapper[4832]: I1003 00:08:35.386640 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 00:08:35 crc kubenswrapper[4832]: E1003 00:08:35.386739 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 00:08:35 crc kubenswrapper[4832]: E1003 00:08:35.386888 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 00:08:35 crc kubenswrapper[4832]: E1003 00:08:35.386936 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 00:08:37 crc kubenswrapper[4832]: I1003 00:08:37.386061 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 00:08:37 crc kubenswrapper[4832]: I1003 00:08:37.386068 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ppthn" Oct 03 00:08:37 crc kubenswrapper[4832]: I1003 00:08:37.386068 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 00:08:37 crc kubenswrapper[4832]: I1003 00:08:37.386150 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 00:08:37 crc kubenswrapper[4832]: E1003 00:08:37.386967 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 00:08:37 crc kubenswrapper[4832]: E1003 00:08:37.387150 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ppthn" podUID="f3e796ab-f354-47c2-8f1b-1d29fe9ab059" Oct 03 00:08:37 crc kubenswrapper[4832]: E1003 00:08:37.387197 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 00:08:37 crc kubenswrapper[4832]: E1003 00:08:37.387318 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 00:08:39 crc kubenswrapper[4832]: I1003 00:08:39.386386 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 00:08:39 crc kubenswrapper[4832]: I1003 00:08:39.386515 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 00:08:39 crc kubenswrapper[4832]: I1003 00:08:39.386367 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ppthn" Oct 03 00:08:39 crc kubenswrapper[4832]: E1003 00:08:39.386573 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 00:08:39 crc kubenswrapper[4832]: I1003 00:08:39.386416 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 00:08:39 crc kubenswrapper[4832]: E1003 00:08:39.393338 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 00:08:39 crc kubenswrapper[4832]: E1003 00:08:39.393528 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ppthn" podUID="f3e796ab-f354-47c2-8f1b-1d29fe9ab059" Oct 03 00:08:39 crc kubenswrapper[4832]: E1003 00:08:39.393716 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 00:08:41 crc kubenswrapper[4832]: I1003 00:08:41.386044 4832 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 00:08:41 crc kubenswrapper[4832]: I1003 00:08:41.386154 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 00:08:41 crc kubenswrapper[4832]: E1003 00:08:41.386313 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 00:08:41 crc kubenswrapper[4832]: E1003 00:08:41.386412 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 00:08:41 crc kubenswrapper[4832]: I1003 00:08:41.386962 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ppthn" Oct 03 00:08:41 crc kubenswrapper[4832]: I1003 00:08:41.387012 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 00:08:41 crc kubenswrapper[4832]: E1003 00:08:41.387204 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ppthn" podUID="f3e796ab-f354-47c2-8f1b-1d29fe9ab059" Oct 03 00:08:41 crc kubenswrapper[4832]: E1003 00:08:41.387351 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 00:08:43 crc kubenswrapper[4832]: I1003 00:08:43.386415 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 00:08:43 crc kubenswrapper[4832]: I1003 00:08:43.386541 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 00:08:43 crc kubenswrapper[4832]: E1003 00:08:43.386605 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 00:08:43 crc kubenswrapper[4832]: I1003 00:08:43.386636 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ppthn" Oct 03 00:08:43 crc kubenswrapper[4832]: I1003 00:08:43.386788 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 00:08:43 crc kubenswrapper[4832]: E1003 00:08:43.386844 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 00:08:43 crc kubenswrapper[4832]: E1003 00:08:43.386939 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 00:08:43 crc kubenswrapper[4832]: E1003 00:08:43.387092 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ppthn" podUID="f3e796ab-f354-47c2-8f1b-1d29fe9ab059" Oct 03 00:08:44 crc kubenswrapper[4832]: I1003 00:08:44.984132 4832 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-rwqt7_96bf1ee1-9249-4db9-83da-17eb15aa5f9a/kube-multus/1.log" Oct 03 00:08:44 crc kubenswrapper[4832]: I1003 00:08:44.984621 4832 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-rwqt7_96bf1ee1-9249-4db9-83da-17eb15aa5f9a/kube-multus/0.log" Oct 03 00:08:44 crc kubenswrapper[4832]: I1003 00:08:44.984672 4832 generic.go:334] "Generic (PLEG): container finished" podID="96bf1ee1-9249-4db9-83da-17eb15aa5f9a" containerID="d3584877a37d5f67dfaff9f091443ff884963e53e62bde6c8cefc55988d8c3a3" exitCode=1 Oct 03 00:08:44 crc kubenswrapper[4832]: I1003 00:08:44.984705 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-rwqt7" event={"ID":"96bf1ee1-9249-4db9-83da-17eb15aa5f9a","Type":"ContainerDied","Data":"d3584877a37d5f67dfaff9f091443ff884963e53e62bde6c8cefc55988d8c3a3"} Oct 03 00:08:44 crc kubenswrapper[4832]: I1003 00:08:44.984746 4832 scope.go:117] "RemoveContainer" containerID="778c9afa5747e424ff72688b9d90dd21e3faec1f409c2b9683b5a16a37685369" Oct 03 00:08:44 crc kubenswrapper[4832]: I1003 00:08:44.985148 4832 scope.go:117] "RemoveContainer" containerID="d3584877a37d5f67dfaff9f091443ff884963e53e62bde6c8cefc55988d8c3a3" Oct 03 00:08:44 crc kubenswrapper[4832]: E1003 00:08:44.985332 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-rwqt7_openshift-multus(96bf1ee1-9249-4db9-83da-17eb15aa5f9a)\"" pod="openshift-multus/multus-rwqt7" podUID="96bf1ee1-9249-4db9-83da-17eb15aa5f9a" Oct 03 00:08:45 crc kubenswrapper[4832]: I1003 00:08:45.386326 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 00:08:45 crc kubenswrapper[4832]: I1003 00:08:45.386400 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 00:08:45 crc kubenswrapper[4832]: I1003 00:08:45.386418 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ppthn" Oct 03 00:08:45 crc kubenswrapper[4832]: E1003 00:08:45.386450 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 00:08:45 crc kubenswrapper[4832]: I1003 00:08:45.386333 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 00:08:45 crc kubenswrapper[4832]: E1003 00:08:45.386816 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 00:08:45 crc kubenswrapper[4832]: E1003 00:08:45.387085 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 00:08:45 crc kubenswrapper[4832]: E1003 00:08:45.387162 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ppthn" podUID="f3e796ab-f354-47c2-8f1b-1d29fe9ab059" Oct 03 00:08:45 crc kubenswrapper[4832]: I1003 00:08:45.388230 4832 scope.go:117] "RemoveContainer" containerID="e1fbfcaa81a9cf32a47adcb39be9ce4e2d5f3b9cd8b8e8556cea49fd099ccef3" Oct 03 00:08:45 crc kubenswrapper[4832]: I1003 00:08:45.989780 4832 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zc9xl_0f189ec1-8169-4886-bff6-f67f7958050e/ovnkube-controller/3.log" Oct 03 00:08:45 crc kubenswrapper[4832]: I1003 00:08:45.992783 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" event={"ID":"0f189ec1-8169-4886-bff6-f67f7958050e","Type":"ContainerStarted","Data":"331f9777df369634e691a52d3564282da8b3d170cca2aad394f2d3e7888228ae"} Oct 03 00:08:45 crc kubenswrapper[4832]: I1003 00:08:45.993145 4832 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" Oct 03 00:08:45 crc kubenswrapper[4832]: I1003 00:08:45.994379 4832 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-rwqt7_96bf1ee1-9249-4db9-83da-17eb15aa5f9a/kube-multus/1.log" Oct 03 00:08:46 crc kubenswrapper[4832]: I1003 00:08:46.207185 4832 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" podStartSLOduration=96.207158406 podStartE2EDuration="1m36.207158406s" podCreationTimestamp="2025-10-03 00:07:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 00:08:46.053584789 +0000 UTC m=+119.315739905" watchObservedRunningTime="2025-10-03 00:08:46.207158406 +0000 UTC m=+119.469313522" Oct 03 00:08:46 crc kubenswrapper[4832]: I1003 00:08:46.208712 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-ppthn"] Oct 03 00:08:46 crc kubenswrapper[4832]: I1003 00:08:46.208868 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ppthn" Oct 03 00:08:46 crc kubenswrapper[4832]: E1003 00:08:46.209014 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ppthn" podUID="f3e796ab-f354-47c2-8f1b-1d29fe9ab059" Oct 03 00:08:47 crc kubenswrapper[4832]: I1003 00:08:47.386279 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 00:08:47 crc kubenswrapper[4832]: I1003 00:08:47.386279 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 00:08:47 crc kubenswrapper[4832]: I1003 00:08:47.386403 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 00:08:47 crc kubenswrapper[4832]: E1003 00:08:47.387230 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 00:08:47 crc kubenswrapper[4832]: E1003 00:08:47.387329 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 00:08:47 crc kubenswrapper[4832]: E1003 00:08:47.387531 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 00:08:47 crc kubenswrapper[4832]: E1003 00:08:47.410220 4832 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Oct 03 00:08:47 crc kubenswrapper[4832]: E1003 00:08:47.481534 4832 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 03 00:08:48 crc kubenswrapper[4832]: I1003 00:08:48.385730 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ppthn" Oct 03 00:08:48 crc kubenswrapper[4832]: E1003 00:08:48.385899 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ppthn" podUID="f3e796ab-f354-47c2-8f1b-1d29fe9ab059" Oct 03 00:08:49 crc kubenswrapper[4832]: I1003 00:08:49.386275 4832 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 00:08:49 crc kubenswrapper[4832]: I1003 00:08:49.386309 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 00:08:49 crc kubenswrapper[4832]: E1003 00:08:49.386427 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 00:08:49 crc kubenswrapper[4832]: I1003 00:08:49.386309 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 00:08:49 crc kubenswrapper[4832]: E1003 00:08:49.386497 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 00:08:49 crc kubenswrapper[4832]: E1003 00:08:49.386563 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 00:08:50 crc kubenswrapper[4832]: I1003 00:08:50.386351 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ppthn" Oct 03 00:08:50 crc kubenswrapper[4832]: E1003 00:08:50.386516 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ppthn" podUID="f3e796ab-f354-47c2-8f1b-1d29fe9ab059" Oct 03 00:08:51 crc kubenswrapper[4832]: I1003 00:08:51.385992 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 00:08:51 crc kubenswrapper[4832]: E1003 00:08:51.386337 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 00:08:51 crc kubenswrapper[4832]: I1003 00:08:51.386036 4832 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 00:08:51 crc kubenswrapper[4832]: E1003 00:08:51.386398 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 00:08:51 crc kubenswrapper[4832]: I1003 00:08:51.385998 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 00:08:51 crc kubenswrapper[4832]: E1003 00:08:51.386713 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 00:08:52 crc kubenswrapper[4832]: I1003 00:08:52.386473 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ppthn" Oct 03 00:08:52 crc kubenswrapper[4832]: E1003 00:08:52.386642 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ppthn" podUID="f3e796ab-f354-47c2-8f1b-1d29fe9ab059" Oct 03 00:08:52 crc kubenswrapper[4832]: E1003 00:08:52.482679 4832 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 03 00:08:53 crc kubenswrapper[4832]: I1003 00:08:53.385817 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 00:08:53 crc kubenswrapper[4832]: I1003 00:08:53.385851 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 00:08:53 crc kubenswrapper[4832]: E1003 00:08:53.385995 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 00:08:53 crc kubenswrapper[4832]: I1003 00:08:53.386066 4832 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 00:08:53 crc kubenswrapper[4832]: E1003 00:08:53.386963 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 00:08:53 crc kubenswrapper[4832]: E1003 00:08:53.387051 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 00:08:54 crc kubenswrapper[4832]: I1003 00:08:54.385536 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ppthn" Oct 03 00:08:54 crc kubenswrapper[4832]: E1003 00:08:54.385692 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ppthn" podUID="f3e796ab-f354-47c2-8f1b-1d29fe9ab059" Oct 03 00:08:55 crc kubenswrapper[4832]: I1003 00:08:55.386376 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 00:08:55 crc kubenswrapper[4832]: I1003 00:08:55.386438 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 00:08:55 crc kubenswrapper[4832]: E1003 00:08:55.386533 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 00:08:55 crc kubenswrapper[4832]: I1003 00:08:55.386581 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 00:08:55 crc kubenswrapper[4832]: E1003 00:08:55.386676 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 00:08:55 crc kubenswrapper[4832]: E1003 00:08:55.386776 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 00:08:56 crc kubenswrapper[4832]: I1003 00:08:56.386438 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ppthn" Oct 03 00:08:56 crc kubenswrapper[4832]: E1003 00:08:56.386603 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ppthn" podUID="f3e796ab-f354-47c2-8f1b-1d29fe9ab059" Oct 03 00:08:57 crc kubenswrapper[4832]: I1003 00:08:57.385965 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 00:08:57 crc kubenswrapper[4832]: I1003 00:08:57.386020 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 00:08:57 crc kubenswrapper[4832]: I1003 00:08:57.386031 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 00:08:57 crc kubenswrapper[4832]: E1003 00:08:57.387991 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 00:08:57 crc kubenswrapper[4832]: E1003 00:08:57.388087 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 00:08:57 crc kubenswrapper[4832]: E1003 00:08:57.388228 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 00:08:57 crc kubenswrapper[4832]: E1003 00:08:57.483523 4832 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 03 00:08:58 crc kubenswrapper[4832]: I1003 00:08:58.386138 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ppthn" Oct 03 00:08:58 crc kubenswrapper[4832]: E1003 00:08:58.386337 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ppthn" podUID="f3e796ab-f354-47c2-8f1b-1d29fe9ab059" Oct 03 00:08:59 crc kubenswrapper[4832]: I1003 00:08:59.385869 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 00:08:59 crc kubenswrapper[4832]: I1003 00:08:59.385997 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 00:08:59 crc kubenswrapper[4832]: I1003 00:08:59.386110 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 00:08:59 crc kubenswrapper[4832]: E1003 00:08:59.386101 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 00:08:59 crc kubenswrapper[4832]: E1003 00:08:59.386272 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 00:08:59 crc kubenswrapper[4832]: E1003 00:08:59.386369 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 00:09:00 crc kubenswrapper[4832]: I1003 00:09:00.385810 4832 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-ppthn" Oct 03 00:09:00 crc kubenswrapper[4832]: E1003 00:09:00.386061 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ppthn" podUID="f3e796ab-f354-47c2-8f1b-1d29fe9ab059" Oct 03 00:09:00 crc kubenswrapper[4832]: I1003 00:09:00.386268 4832 scope.go:117] "RemoveContainer" containerID="d3584877a37d5f67dfaff9f091443ff884963e53e62bde6c8cefc55988d8c3a3" Oct 03 00:09:01 crc kubenswrapper[4832]: I1003 00:09:01.044393 4832 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-rwqt7_96bf1ee1-9249-4db9-83da-17eb15aa5f9a/kube-multus/1.log" Oct 03 00:09:01 crc kubenswrapper[4832]: I1003 00:09:01.044772 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-rwqt7" event={"ID":"96bf1ee1-9249-4db9-83da-17eb15aa5f9a","Type":"ContainerStarted","Data":"f7b3d48173bbf797245933ace78f31ccd4545faddc97703430950e2068752402"} Oct 03 00:09:01 crc kubenswrapper[4832]: I1003 00:09:01.386050 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 00:09:01 crc kubenswrapper[4832]: I1003 00:09:01.386122 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 00:09:01 crc kubenswrapper[4832]: E1003 00:09:01.386183 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 00:09:01 crc kubenswrapper[4832]: E1003 00:09:01.386325 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 00:09:01 crc kubenswrapper[4832]: I1003 00:09:01.386377 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 00:09:01 crc kubenswrapper[4832]: E1003 00:09:01.386443 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 00:09:02 crc kubenswrapper[4832]: I1003 00:09:02.386481 4832 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-ppthn" Oct 03 00:09:02 crc kubenswrapper[4832]: E1003 00:09:02.386661 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ppthn" podUID="f3e796ab-f354-47c2-8f1b-1d29fe9ab059" Oct 03 00:09:03 crc kubenswrapper[4832]: I1003 00:09:03.385890 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 00:09:03 crc kubenswrapper[4832]: I1003 00:09:03.385942 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 00:09:03 crc kubenswrapper[4832]: I1003 00:09:03.386108 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 00:09:03 crc kubenswrapper[4832]: I1003 00:09:03.387803 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Oct 03 00:09:03 crc kubenswrapper[4832]: I1003 00:09:03.387809 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Oct 03 00:09:03 crc kubenswrapper[4832]: I1003 00:09:03.388304 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Oct 03 00:09:03 crc kubenswrapper[4832]: I1003 00:09:03.388730 4832 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Oct 03 00:09:04 crc kubenswrapper[4832]: I1003 00:09:04.385997 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ppthn" Oct 03 00:09:04 crc kubenswrapper[4832]: I1003 00:09:04.388163 4832 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Oct 03 00:09:04 crc kubenswrapper[4832]: I1003 00:09:04.388311 4832 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.737323 4832 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.776917 4832 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-r6kz6"] Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.777382 4832 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-qpxvt"] Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.777830 4832 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-v4h7f"] Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.777938 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-qpxvt" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.777937 4832 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-r6kz6" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.779472 4832 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-dpskq"] Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.779850 4832 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ps468"] Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.780092 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-v4h7f" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.782176 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-dpskq" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.782332 4832 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-kjfxn"] Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.782832 4832 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-pruner-29324160-vrdgr"] Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.782892 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-kjfxn" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.783445 4832 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-znmk2"] Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.782847 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ps468" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.784185 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-pruner-29324160-vrdgr" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.784356 4832 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-sl4zz"] Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.784550 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-znmk2" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.785447 4832 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-sl4zz" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.810551 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.810934 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.817253 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.823197 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.828975 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/a23d32e8-79df-45c1-8221-f714cbfb4a15-encryption-config\") pod \"apiserver-76f77b778f-qpxvt\" (UID: \"a23d32e8-79df-45c1-8221-f714cbfb4a15\") " pod="openshift-apiserver/apiserver-76f77b778f-qpxvt" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.829043 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6f13893c-16c4-4867-a020-3c8868b11859-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-kjfxn\" (UID: \"6f13893c-16c4-4867-a020-3c8868b11859\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-kjfxn" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.829071 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvhx9\" (UniqueName: \"kubernetes.io/projected/69f97b67-d300-4ef4-8951-934c01b68892-kube-api-access-pvhx9\") pod \"controller-manager-879f6c89f-r6kz6\" (UID: \"69f97b67-d300-4ef4-8951-934c01b68892\") " pod="openshift-controller-manager/controller-manager-879f6c89f-r6kz6" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.829215 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ln8tt\" (UniqueName: \"kubernetes.io/projected/6f13893c-16c4-4867-a020-3c8868b11859-kube-api-access-ln8tt\") pod \"authentication-operator-69f744f599-kjfxn\" (UID: \"6f13893c-16c4-4867-a020-3c8868b11859\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-kjfxn" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.829240 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/a23d32e8-79df-45c1-8221-f714cbfb4a15-audit\") pod \"apiserver-76f77b778f-qpxvt\" (UID: \"a23d32e8-79df-45c1-8221-f714cbfb4a15\") " pod="openshift-apiserver/apiserver-76f77b778f-qpxvt" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.829266 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q47qj\" (UniqueName: \"kubernetes.io/projected/a23d32e8-79df-45c1-8221-f714cbfb4a15-kube-api-access-q47qj\") pod \"apiserver-76f77b778f-qpxvt\" (UID: \"a23d32e8-79df-45c1-8221-f714cbfb4a15\") " pod="openshift-apiserver/apiserver-76f77b778f-qpxvt" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.829296 4832 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a23d32e8-79df-45c1-8221-f714cbfb4a15-serving-cert\") pod \"apiserver-76f77b778f-qpxvt\" (UID: \"a23d32e8-79df-45c1-8221-f714cbfb4a15\") " pod="openshift-apiserver/apiserver-76f77b778f-qpxvt" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.829395 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a23d32e8-79df-45c1-8221-f714cbfb4a15-etcd-client\") pod \"apiserver-76f77b778f-qpxvt\" (UID: \"a23d32e8-79df-45c1-8221-f714cbfb4a15\") " pod="openshift-apiserver/apiserver-76f77b778f-qpxvt" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.829424 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/20a4d831-a952-4290-9025-dffe5d65890e-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-ps468\" (UID: \"20a4d831-a952-4290-9025-dffe5d65890e\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ps468" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.829465 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98bpm\" (UniqueName: \"kubernetes.io/projected/ac043bfa-15d5-4a58-a9bc-0601bfd7a7ca-kube-api-access-98bpm\") pod \"cluster-samples-operator-665b6dd947-dpskq\" (UID: \"ac043bfa-15d5-4a58-a9bc-0601bfd7a7ca\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-dpskq" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.829548 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a23d32e8-79df-45c1-8221-f714cbfb4a15-config\") pod \"apiserver-76f77b778f-qpxvt\" (UID: \"a23d32e8-79df-45c1-8221-f714cbfb4a15\") " pod="openshift-apiserver/apiserver-76f77b778f-qpxvt" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.829576 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6f13893c-16c4-4867-a020-3c8868b11859-service-ca-bundle\") pod \"authentication-operator-69f744f599-kjfxn\" (UID: \"6f13893c-16c4-4867-a020-3c8868b11859\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-kjfxn" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.829622 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/bb3f4c9e-e9cb-4188-9a87-8bdefa5e0ee8-images\") pod \"machine-api-operator-5694c8668f-v4h7f\" (UID: \"bb3f4c9e-e9cb-4188-9a87-8bdefa5e0ee8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-v4h7f" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.829656 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kt5bk\" (UniqueName: \"kubernetes.io/projected/bb3f4c9e-e9cb-4188-9a87-8bdefa5e0ee8-kube-api-access-kt5bk\") pod \"machine-api-operator-5694c8668f-v4h7f\" (UID: \"bb3f4c9e-e9cb-4188-9a87-8bdefa5e0ee8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-v4h7f" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.829682 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a23d32e8-79df-45c1-8221-f714cbfb4a15-audit-dir\") pod \"apiserver-76f77b778f-qpxvt\" (UID: \"a23d32e8-79df-45c1-8221-f714cbfb4a15\") " pod="openshift-apiserver/apiserver-76f77b778f-qpxvt" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.829714 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a23d32e8-79df-45c1-8221-f714cbfb4a15-trusted-ca-bundle\") pod \"apiserver-76f77b778f-qpxvt\" (UID: \"a23d32e8-79df-45c1-8221-f714cbfb4a15\") " pod="openshift-apiserver/apiserver-76f77b778f-qpxvt" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.829738 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69f97b67-d300-4ef4-8951-934c01b68892-config\") pod \"controller-manager-879f6c89f-r6kz6\" (UID: \"69f97b67-d300-4ef4-8951-934c01b68892\") " pod="openshift-controller-manager/controller-manager-879f6c89f-r6kz6" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.829786 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb3f4c9e-e9cb-4188-9a87-8bdefa5e0ee8-config\") pod \"machine-api-operator-5694c8668f-v4h7f\" (UID: \"bb3f4c9e-e9cb-4188-9a87-8bdefa5e0ee8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-v4h7f" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.829814 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/69f97b67-d300-4ef4-8951-934c01b68892-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-r6kz6\" (UID: \"69f97b67-d300-4ef4-8951-934c01b68892\") " pod="openshift-controller-manager/controller-manager-879f6c89f-r6kz6" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.829839 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/a23d32e8-79df-45c1-8221-f714cbfb4a15-image-import-ca\") pod \"apiserver-76f77b778f-qpxvt\" (UID: \"a23d32e8-79df-45c1-8221-f714cbfb4a15\") " pod="openshift-apiserver/apiserver-76f77b778f-qpxvt" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.829947 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/20a4d831-a952-4290-9025-dffe5d65890e-config\") pod \"openshift-apiserver-operator-796bbdcf4f-ps468\" (UID: \"20a4d831-a952-4290-9025-dffe5d65890e\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ps468" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.829996 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/69f97b67-d300-4ef4-8951-934c01b68892-serving-cert\") pod \"controller-manager-879f6c89f-r6kz6\" (UID: \"69f97b67-d300-4ef4-8951-934c01b68892\") " pod="openshift-controller-manager/controller-manager-879f6c89f-r6kz6" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.830015 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/a23d32e8-79df-45c1-8221-f714cbfb4a15-node-pullsecrets\") pod 
\"apiserver-76f77b778f-qpxvt\" (UID: \"a23d32e8-79df-45c1-8221-f714cbfb4a15\") " pod="openshift-apiserver/apiserver-76f77b778f-qpxvt" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.830153 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f13893c-16c4-4867-a020-3c8868b11859-config\") pod \"authentication-operator-69f744f599-kjfxn\" (UID: \"6f13893c-16c4-4867-a020-3c8868b11859\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-kjfxn" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.830192 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ac043bfa-15d5-4a58-a9bc-0601bfd7a7ca-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-dpskq\" (UID: \"ac043bfa-15d5-4a58-a9bc-0601bfd7a7ca\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-dpskq" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.830228 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6f13893c-16c4-4867-a020-3c8868b11859-serving-cert\") pod \"authentication-operator-69f744f599-kjfxn\" (UID: \"6f13893c-16c4-4867-a020-3c8868b11859\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-kjfxn" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.830266 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/a23d32e8-79df-45c1-8221-f714cbfb4a15-etcd-serving-ca\") pod \"apiserver-76f77b778f-qpxvt\" (UID: \"a23d32e8-79df-45c1-8221-f714cbfb4a15\") " pod="openshift-apiserver/apiserver-76f77b778f-qpxvt" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.830414 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9prv\" (UniqueName: \"kubernetes.io/projected/20a4d831-a952-4290-9025-dffe5d65890e-kube-api-access-n9prv\") pod \"openshift-apiserver-operator-796bbdcf4f-ps468\" (UID: \"20a4d831-a952-4290-9025-dffe5d65890e\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ps468" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.830459 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/bb3f4c9e-e9cb-4188-9a87-8bdefa5e0ee8-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-v4h7f\" (UID: \"bb3f4c9e-e9cb-4188-9a87-8bdefa5e0ee8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-v4h7f" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.830596 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/69f97b67-d300-4ef4-8951-934c01b68892-client-ca\") pod \"controller-manager-879f6c89f-r6kz6\" (UID: \"69f97b67-d300-4ef4-8951-934c01b68892\") " pod="openshift-controller-manager/controller-manager-879f6c89f-r6kz6" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.831397 4832 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-bbh52"] Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.835780 4832 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bbh52" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.840535 4832 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.842577 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.842658 4832 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.842798 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"serviceca" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.843157 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.843876 4832 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.843989 4832 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.844177 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.844232 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.844277 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.844324 4832 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.844374 4832 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.844411 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.844450 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.844559 4832 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.844616 4832 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.844798 4832 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.845051 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.850087 4832 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-authentication"/"openshift-service-ca.crt" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.850167 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.850341 4832 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.850431 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.850458 4832 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.850549 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.850638 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.850693 4832 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.850575 4832 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.850771 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.850871 4832 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.850918 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.851006 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.851017 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.851095 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.851336 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.851891 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.852224 4832 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.852344 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.852461 4832 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.852586 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.852712 4832 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.853644 4832 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.854819 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.855079 4832 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"pruner-dockercfg-p7bcw" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.856724 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.856904 4832 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.869360 4832 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-hxv6p"] Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.870068 4832 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-hqmpt"] Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.870375 4832 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xjqmz"] Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.870493 4832 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.870601 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.870792 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xjqmz" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.871313 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-hxv6p" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.871429 4832 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hqmpt" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.872053 4832 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.872294 4832 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.881241 4832 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.881276 4832 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.883777 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.884871 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.884880 4832 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.885102 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.887397 4832 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.887487 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.888091 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.888191 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.892529 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.892720 4832 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-r4wx4"] Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.893531 4832 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-vgm2t"] Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.896063 4832 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-r4wx4" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.898058 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.921898 4832 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.922409 4832 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-g2phc"] Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.923236 4832 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-zg4z4"] Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.923828 4832 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-tv5hb"] Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.924348 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-vgm2t" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.925997 4832 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-djlqs"] Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.926619 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-g2phc" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.926935 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-tv5hb" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.927502 4832 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.927829 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-zg4z4" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.928164 4832 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-6v588"] Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.929603 4832 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.930159 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.929287 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.933074 4832 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.931386 4832 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6v588" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.931322 4832 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-9kl2v"] Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.931407 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/bb3f4c9e-e9cb-4188-9a87-8bdefa5e0ee8-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-v4h7f\" (UID: \"bb3f4c9e-e9cb-4188-9a87-8bdefa5e0ee8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-v4h7f" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.933788 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1d8a5bbc-dc0e-4b7b-b571-83b7be0abe64-client-ca\") pod \"route-controller-manager-6576b87f9c-hqmpt\" (UID: \"1d8a5bbc-dc0e-4b7b-b571-83b7be0abe64\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hqmpt" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.933857 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/69f97b67-d300-4ef4-8951-934c01b68892-client-ca\") pod \"controller-manager-879f6c89f-r6kz6\" (UID: \"69f97b67-d300-4ef4-8951-934c01b68892\") " pod="openshift-controller-manager/controller-manager-879f6c89f-r6kz6" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.933900 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/a23d32e8-79df-45c1-8221-f714cbfb4a15-encryption-config\") pod \"apiserver-76f77b778f-qpxvt\" (UID: \"a23d32e8-79df-45c1-8221-f714cbfb4a15\") " pod="openshift-apiserver/apiserver-76f77b778f-qpxvt" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.933941 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/e422e2ec-57ff-4009-8065-bfa489b34334-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-sl4zz\" (UID: \"e422e2ec-57ff-4009-8065-bfa489b34334\") " pod="openshift-authentication/oauth-openshift-558db77b4-sl4zz" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.933978 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6f13893c-16c4-4867-a020-3c8868b11859-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-kjfxn\" (UID: \"6f13893c-16c4-4867-a020-3c8868b11859\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-kjfxn" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.934015 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pvhx9\" (UniqueName: \"kubernetes.io/projected/69f97b67-d300-4ef4-8951-934c01b68892-kube-api-access-pvhx9\") pod \"controller-manager-879f6c89f-r6kz6\" (UID: \"69f97b67-d300-4ef4-8951-934c01b68892\") " pod="openshift-controller-manager/controller-manager-879f6c89f-r6kz6" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.934096 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjg2h\" (UniqueName: 
\"kubernetes.io/projected/1d8a5bbc-dc0e-4b7b-b571-83b7be0abe64-kube-api-access-fjg2h\") pod \"route-controller-manager-6576b87f9c-hqmpt\" (UID: \"1d8a5bbc-dc0e-4b7b-b571-83b7be0abe64\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hqmpt" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.934127 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e422e2ec-57ff-4009-8065-bfa489b34334-audit-dir\") pod \"oauth-openshift-558db77b4-sl4zz\" (UID: \"e422e2ec-57ff-4009-8065-bfa489b34334\") " pod="openshift-authentication/oauth-openshift-558db77b4-sl4zz" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.931706 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.931857 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.931961 4832 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.932079 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.932362 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.935307 4832 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-w8cn7"] Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.932672 4832 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.935870 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ln8tt\" (UniqueName: \"kubernetes.io/projected/6f13893c-16c4-4867-a020-3c8868b11859-kube-api-access-ln8tt\") pod \"authentication-operator-69f744f599-kjfxn\" (UID: \"6f13893c-16c4-4867-a020-3c8868b11859\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-kjfxn" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.935918 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/a23d32e8-79df-45c1-8221-f714cbfb4a15-audit\") pod \"apiserver-76f77b778f-qpxvt\" (UID: \"a23d32e8-79df-45c1-8221-f714cbfb4a15\") " pod="openshift-apiserver/apiserver-76f77b778f-qpxvt" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.936001 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q47qj\" (UniqueName: \"kubernetes.io/projected/a23d32e8-79df-45c1-8221-f714cbfb4a15-kube-api-access-q47qj\") pod \"apiserver-76f77b778f-qpxvt\" (UID: \"a23d32e8-79df-45c1-8221-f714cbfb4a15\") " pod="openshift-apiserver/apiserver-76f77b778f-qpxvt" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.936039 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7367a27e-1830-4f44-b9c2-543541bbb794-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-xjqmz\" (UID: \"7367a27e-1830-4f44-b9c2-543541bbb794\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xjqmz" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.936079 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a23d32e8-79df-45c1-8221-f714cbfb4a15-serving-cert\") pod \"apiserver-76f77b778f-qpxvt\" (UID: \"a23d32e8-79df-45c1-8221-f714cbfb4a15\") " pod="openshift-apiserver/apiserver-76f77b778f-qpxvt" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.936111 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxtq5\" (UniqueName: \"kubernetes.io/projected/c73e2dc6-6174-4a76-a7d2-192ef93b7935-kube-api-access-jxtq5\") pod \"downloads-7954f5f757-znmk2\" (UID: \"c73e2dc6-6174-4a76-a7d2-192ef93b7935\") " pod="openshift-console/downloads-7954f5f757-znmk2" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.936157 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/e422e2ec-57ff-4009-8065-bfa489b34334-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-sl4zz\" (UID: \"e422e2ec-57ff-4009-8065-bfa489b34334\") " pod="openshift-authentication/oauth-openshift-558db77b4-sl4zz" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.936197 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/e422e2ec-57ff-4009-8065-bfa489b34334-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-sl4zz\" (UID: \"e422e2ec-57ff-4009-8065-bfa489b34334\") " pod="openshift-authentication/oauth-openshift-558db77b4-sl4zz" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.936237 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/e422e2ec-57ff-4009-8065-bfa489b34334-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-sl4zz\" (UID: \"e422e2ec-57ff-4009-8065-bfa489b34334\") " pod="openshift-authentication/oauth-openshift-558db77b4-sl4zz" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.936273 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drj7g\" (UniqueName: \"kubernetes.io/projected/a3407ec5-4299-4c9d-9b37-64448de352b7-kube-api-access-drj7g\") pod \"console-f9d7485db-hxv6p\" (UID: \"a3407ec5-4299-4c9d-9b37-64448de352b7\") " pod="openshift-console/console-f9d7485db-hxv6p" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.936305 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e422e2ec-57ff-4009-8065-bfa489b34334-audit-policies\") pod \"oauth-openshift-558db77b4-sl4zz\" (UID: \"e422e2ec-57ff-4009-8065-bfa489b34334\") " pod="openshift-authentication/oauth-openshift-558db77b4-sl4zz" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.936594 4832 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.936614 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/e422e2ec-57ff-4009-8065-bfa489b34334-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-sl4zz\" (UID: \"e422e2ec-57ff-4009-8065-bfa489b34334\") " pod="openshift-authentication/oauth-openshift-558db77b4-sl4zz" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.936791 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a23d32e8-79df-45c1-8221-f714cbfb4a15-etcd-client\") pod \"apiserver-76f77b778f-qpxvt\" (UID: \"a23d32e8-79df-45c1-8221-f714cbfb4a15\") " pod="openshift-apiserver/apiserver-76f77b778f-qpxvt" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.939018 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/20a4d831-a952-4290-9025-dffe5d65890e-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-ps468\" (UID: \"20a4d831-a952-4290-9025-dffe5d65890e\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ps468" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.939050 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1d8a5bbc-dc0e-4b7b-b571-83b7be0abe64-serving-cert\") pod \"route-controller-manager-6576b87f9c-hqmpt\" (UID: \"1d8a5bbc-dc0e-4b7b-b571-83b7be0abe64\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hqmpt" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.939120 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/17d5f7b6-d924-4095-a636-4f283d40d2e8-machine-approver-tls\") pod \"machine-approver-56656f9798-bbh52\" (UID: \"17d5f7b6-d924-4095-a636-4f283d40d2e8\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bbh52" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.939142 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qndd5\" (UniqueName: \"kubernetes.io/projected/e422e2ec-57ff-4009-8065-bfa489b34334-kube-api-access-qndd5\") pod \"oauth-openshift-558db77b4-sl4zz\" (UID: \"e422e2ec-57ff-4009-8065-bfa489b34334\") " pod="openshift-authentication/oauth-openshift-558db77b4-sl4zz" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.939189 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-98bpm\" (UniqueName: \"kubernetes.io/projected/ac043bfa-15d5-4a58-a9bc-0601bfd7a7ca-kube-api-access-98bpm\") pod \"cluster-samples-operator-665b6dd947-dpskq\" (UID: \"ac043bfa-15d5-4a58-a9bc-0601bfd7a7ca\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-dpskq" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.939524 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a23d32e8-79df-45c1-8221-f714cbfb4a15-config\") pod \"apiserver-76f77b778f-qpxvt\" (UID: \"a23d32e8-79df-45c1-8221-f714cbfb4a15\") " pod="openshift-apiserver/apiserver-76f77b778f-qpxvt" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.939568 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/6f13893c-16c4-4867-a020-3c8868b11859-service-ca-bundle\") pod \"authentication-operator-69f744f599-kjfxn\" (UID: \"6f13893c-16c4-4867-a020-3c8868b11859\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-kjfxn" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.939601 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7j75j\" (UniqueName: \"kubernetes.io/projected/3335886b-58ec-44d4-9b6a-b7d81b4aa3fe-kube-api-access-7j75j\") pod \"image-pruner-29324160-vrdgr\" (UID: \"3335886b-58ec-44d4-9b6a-b7d81b4aa3fe\") " pod="openshift-image-registry/image-pruner-29324160-vrdgr" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.939648 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9dqv\" (UniqueName: \"kubernetes.io/projected/7367a27e-1830-4f44-b9c2-543541bbb794-kube-api-access-l9dqv\") pod \"cluster-image-registry-operator-dc59b4c8b-xjqmz\" (UID: \"7367a27e-1830-4f44-b9c2-543541bbb794\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xjqmz" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.939679 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a3407ec5-4299-4c9d-9b37-64448de352b7-console-oauth-config\") pod \"console-f9d7485db-hxv6p\" (UID: \"a3407ec5-4299-4c9d-9b37-64448de352b7\") " pod="openshift-console/console-f9d7485db-hxv6p" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.939796 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d8a5bbc-dc0e-4b7b-b571-83b7be0abe64-config\") pod \"route-controller-manager-6576b87f9c-hqmpt\" (UID: \"1d8a5bbc-dc0e-4b7b-b571-83b7be0abe64\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hqmpt" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.939819 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/7367a27e-1830-4f44-b9c2-543541bbb794-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-xjqmz\" (UID: \"7367a27e-1830-4f44-b9c2-543541bbb794\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xjqmz" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.939879 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/bb3f4c9e-e9cb-4188-9a87-8bdefa5e0ee8-images\") pod \"machine-api-operator-5694c8668f-v4h7f\" (UID: \"bb3f4c9e-e9cb-4188-9a87-8bdefa5e0ee8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-v4h7f" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.939899 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/e422e2ec-57ff-4009-8065-bfa489b34334-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-sl4zz\" (UID: \"e422e2ec-57ff-4009-8065-bfa489b34334\") " pod="openshift-authentication/oauth-openshift-558db77b4-sl4zz" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.939942 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a3407ec5-4299-4c9d-9b37-64448de352b7-trusted-ca-bundle\") pod \"console-f9d7485db-hxv6p\" (UID: \"a3407ec5-4299-4c9d-9b37-64448de352b7\") " pod="openshift-console/console-f9d7485db-hxv6p" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.939969 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a23d32e8-79df-45c1-8221-f714cbfb4a15-audit-dir\") pod \"apiserver-76f77b778f-qpxvt\" (UID: \"a23d32e8-79df-45c1-8221-f714cbfb4a15\") " pod="openshift-apiserver/apiserver-76f77b778f-qpxvt" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.940008 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/17d5f7b6-d924-4095-a636-4f283d40d2e8-auth-proxy-config\") pod \"machine-approver-56656f9798-bbh52\" (UID: \"17d5f7b6-d924-4095-a636-4f283d40d2e8\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bbh52" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.940060 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kt5bk\" (UniqueName: \"kubernetes.io/projected/bb3f4c9e-e9cb-4188-9a87-8bdefa5e0ee8-kube-api-access-kt5bk\") pod \"machine-api-operator-5694c8668f-v4h7f\" (UID: \"bb3f4c9e-e9cb-4188-9a87-8bdefa5e0ee8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-v4h7f" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.940138 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a23d32e8-79df-45c1-8221-f714cbfb4a15-trusted-ca-bundle\") pod \"apiserver-76f77b778f-qpxvt\" (UID: \"a23d32e8-79df-45c1-8221-f714cbfb4a15\") " pod="openshift-apiserver/apiserver-76f77b778f-qpxvt" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.940182 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/e422e2ec-57ff-4009-8065-bfa489b34334-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-sl4zz\" (UID: \"e422e2ec-57ff-4009-8065-bfa489b34334\") " pod="openshift-authentication/oauth-openshift-558db77b4-sl4zz" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.940500 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69f97b67-d300-4ef4-8951-934c01b68892-config\") pod \"controller-manager-879f6c89f-r6kz6\" (UID: \"69f97b67-d300-4ef4-8951-934c01b68892\") " pod="openshift-controller-manager/controller-manager-879f6c89f-r6kz6" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.940556 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66whk\" (UniqueName: \"kubernetes.io/projected/17d5f7b6-d924-4095-a636-4f283d40d2e8-kube-api-access-66whk\") pod \"machine-approver-56656f9798-bbh52\" (UID: \"17d5f7b6-d924-4095-a636-4f283d40d2e8\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bbh52" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.940615 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb3f4c9e-e9cb-4188-9a87-8bdefa5e0ee8-config\") pod \"machine-api-operator-5694c8668f-v4h7f\" (UID: 
\"bb3f4c9e-e9cb-4188-9a87-8bdefa5e0ee8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-v4h7f" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.940659 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3335886b-58ec-44d4-9b6a-b7d81b4aa3fe-serviceca\") pod \"image-pruner-29324160-vrdgr\" (UID: \"3335886b-58ec-44d4-9b6a-b7d81b4aa3fe\") " pod="openshift-image-registry/image-pruner-29324160-vrdgr" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.940703 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a3407ec5-4299-4c9d-9b37-64448de352b7-service-ca\") pod \"console-f9d7485db-hxv6p\" (UID: \"a3407ec5-4299-4c9d-9b37-64448de352b7\") " pod="openshift-console/console-f9d7485db-hxv6p" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.940747 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a3407ec5-4299-4c9d-9b37-64448de352b7-console-config\") pod \"console-f9d7485db-hxv6p\" (UID: \"a3407ec5-4299-4c9d-9b37-64448de352b7\") " pod="openshift-console/console-f9d7485db-hxv6p" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.940838 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/69f97b67-d300-4ef4-8951-934c01b68892-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-r6kz6\" (UID: \"69f97b67-d300-4ef4-8951-934c01b68892\") " pod="openshift-controller-manager/controller-manager-879f6c89f-r6kz6" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.940920 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/a23d32e8-79df-45c1-8221-f714cbfb4a15-image-import-ca\") pod \"apiserver-76f77b778f-qpxvt\" (UID: \"a23d32e8-79df-45c1-8221-f714cbfb4a15\") " pod="openshift-apiserver/apiserver-76f77b778f-qpxvt" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.940970 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/20a4d831-a952-4290-9025-dffe5d65890e-config\") pod \"openshift-apiserver-operator-796bbdcf4f-ps468\" (UID: \"20a4d831-a952-4290-9025-dffe5d65890e\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ps468" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.941064 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e422e2ec-57ff-4009-8065-bfa489b34334-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-sl4zz\" (UID: \"e422e2ec-57ff-4009-8065-bfa489b34334\") " pod="openshift-authentication/oauth-openshift-558db77b4-sl4zz" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.941140 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a3407ec5-4299-4c9d-9b37-64448de352b7-console-serving-cert\") pod \"console-f9d7485db-hxv6p\" (UID: \"a3407ec5-4299-4c9d-9b37-64448de352b7\") " pod="openshift-console/console-f9d7485db-hxv6p" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.941210 4832 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/69f97b67-d300-4ef4-8951-934c01b68892-serving-cert\") pod \"controller-manager-879f6c89f-r6kz6\" (UID: \"69f97b67-d300-4ef4-8951-934c01b68892\") " pod="openshift-controller-manager/controller-manager-879f6c89f-r6kz6" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.941241 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/a23d32e8-79df-45c1-8221-f714cbfb4a15-node-pullsecrets\") pod \"apiserver-76f77b778f-qpxvt\" (UID: \"a23d32e8-79df-45c1-8221-f714cbfb4a15\") " pod="openshift-apiserver/apiserver-76f77b778f-qpxvt" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.941259 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/e422e2ec-57ff-4009-8065-bfa489b34334-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-sl4zz\" (UID: \"e422e2ec-57ff-4009-8065-bfa489b34334\") " pod="openshift-authentication/oauth-openshift-558db77b4-sl4zz" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.937601 4832 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.941325 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f13893c-16c4-4867-a020-3c8868b11859-config\") pod \"authentication-operator-69f744f599-kjfxn\" (UID: \"6f13893c-16c4-4867-a020-3c8868b11859\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-kjfxn" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.941378 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/e422e2ec-57ff-4009-8065-bfa489b34334-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-sl4zz\" (UID: \"e422e2ec-57ff-4009-8065-bfa489b34334\") " pod="openshift-authentication/oauth-openshift-558db77b4-sl4zz" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.941415 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ac043bfa-15d5-4a58-a9bc-0601bfd7a7ca-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-dpskq\" (UID: \"ac043bfa-15d5-4a58-a9bc-0601bfd7a7ca\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-dpskq" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.941471 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6f13893c-16c4-4867-a020-3c8868b11859-serving-cert\") pod \"authentication-operator-69f744f599-kjfxn\" (UID: \"6f13893c-16c4-4867-a020-3c8868b11859\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-kjfxn" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.941491 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/a23d32e8-79df-45c1-8221-f714cbfb4a15-etcd-serving-ca\") pod \"apiserver-76f77b778f-qpxvt\" (UID: \"a23d32e8-79df-45c1-8221-f714cbfb4a15\") " pod="openshift-apiserver/apiserver-76f77b778f-qpxvt" Oct 03 00:09:05 crc kubenswrapper[4832]: 
I1003 00:09:05.941534 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7367a27e-1830-4f44-b9c2-543541bbb794-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-xjqmz\" (UID: \"7367a27e-1830-4f44-b9c2-543541bbb794\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xjqmz" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.941557 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a3407ec5-4299-4c9d-9b37-64448de352b7-oauth-serving-cert\") pod \"console-f9d7485db-hxv6p\" (UID: \"a3407ec5-4299-4c9d-9b37-64448de352b7\") " pod="openshift-console/console-f9d7485db-hxv6p" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.937213 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-9kl2v" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.941627 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n9prv\" (UniqueName: \"kubernetes.io/projected/20a4d831-a952-4290-9025-dffe5d65890e-kube-api-access-n9prv\") pod \"openshift-apiserver-operator-796bbdcf4f-ps468\" (UID: \"20a4d831-a952-4290-9025-dffe5d65890e\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ps468" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.937034 4832 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-h4hkd"] Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.943301 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/69f97b67-d300-4ef4-8951-934c01b68892-client-ca\") pod \"controller-manager-879f6c89f-r6kz6\" (UID: \"69f97b67-d300-4ef4-8951-934c01b68892\") " pod="openshift-controller-manager/controller-manager-879f6c89f-r6kz6" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.943643 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/a23d32e8-79df-45c1-8221-f714cbfb4a15-audit\") pod \"apiserver-76f77b778f-qpxvt\" (UID: \"a23d32e8-79df-45c1-8221-f714cbfb4a15\") " pod="openshift-apiserver/apiserver-76f77b778f-qpxvt" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.943608 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a23d32e8-79df-45c1-8221-f714cbfb4a15-audit-dir\") pod \"apiserver-76f77b778f-qpxvt\" (UID: \"a23d32e8-79df-45c1-8221-f714cbfb4a15\") " pod="openshift-apiserver/apiserver-76f77b778f-qpxvt" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.943897 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6f13893c-16c4-4867-a020-3c8868b11859-service-ca-bundle\") pod \"authentication-operator-69f744f599-kjfxn\" (UID: \"6f13893c-16c4-4867-a020-3c8868b11859\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-kjfxn" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.941654 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17d5f7b6-d924-4095-a636-4f283d40d2e8-config\") pod 
\"machine-approver-56656f9798-bbh52\" (UID: \"17d5f7b6-d924-4095-a636-4f283d40d2e8\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bbh52" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.943915 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/a23d32e8-79df-45c1-8221-f714cbfb4a15-node-pullsecrets\") pod \"apiserver-76f77b778f-qpxvt\" (UID: \"a23d32e8-79df-45c1-8221-f714cbfb4a15\") " pod="openshift-apiserver/apiserver-76f77b778f-qpxvt" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.937260 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-djlqs" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.944692 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/bb3f4c9e-e9cb-4188-9a87-8bdefa5e0ee8-images\") pod \"machine-api-operator-5694c8668f-v4h7f\" (UID: \"bb3f4c9e-e9cb-4188-9a87-8bdefa5e0ee8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-v4h7f" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.937956 4832 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.945382 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6f13893c-16c4-4867-a020-3c8868b11859-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-kjfxn\" (UID: \"6f13893c-16c4-4867-a020-3c8868b11859\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-kjfxn" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.945699 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a23d32e8-79df-45c1-8221-f714cbfb4a15-config\") pod \"apiserver-76f77b778f-qpxvt\" (UID: \"a23d32e8-79df-45c1-8221-f714cbfb4a15\") " pod="openshift-apiserver/apiserver-76f77b778f-qpxvt" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.937146 4832 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-w8cn7" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.946609 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/20a4d831-a952-4290-9025-dffe5d65890e-config\") pod \"openshift-apiserver-operator-796bbdcf4f-ps468\" (UID: \"20a4d831-a952-4290-9025-dffe5d65890e\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ps468" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.946615 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f13893c-16c4-4867-a020-3c8868b11859-config\") pod \"authentication-operator-69f744f599-kjfxn\" (UID: \"6f13893c-16c4-4867-a020-3c8868b11859\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-kjfxn" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.938032 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.946711 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/e422e2ec-57ff-4009-8065-bfa489b34334-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-sl4zz\" (UID: \"e422e2ec-57ff-4009-8065-bfa489b34334\") " pod="openshift-authentication/oauth-openshift-558db77b4-sl4zz" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.938114 4832 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.938188 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.938324 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.938390 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.938481 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.947358 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a23d32e8-79df-45c1-8221-f714cbfb4a15-etcd-client\") pod \"apiserver-76f77b778f-qpxvt\" (UID: \"a23d32e8-79df-45c1-8221-f714cbfb4a15\") " pod="openshift-apiserver/apiserver-76f77b778f-qpxvt" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.938525 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.947405 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/bb3f4c9e-e9cb-4188-9a87-8bdefa5e0ee8-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-v4h7f\" (UID: \"bb3f4c9e-e9cb-4188-9a87-8bdefa5e0ee8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-v4h7f" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 
00:09:05.938594 4832 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.938725 4832 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.938816 4832 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.947739 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.948025 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/a23d32e8-79df-45c1-8221-f714cbfb4a15-etcd-serving-ca\") pod \"apiserver-76f77b778f-qpxvt\" (UID: \"a23d32e8-79df-45c1-8221-f714cbfb4a15\") " pod="openshift-apiserver/apiserver-76f77b778f-qpxvt" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.948187 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb3f4c9e-e9cb-4188-9a87-8bdefa5e0ee8-config\") pod \"machine-api-operator-5694c8668f-v4h7f\" (UID: \"bb3f4c9e-e9cb-4188-9a87-8bdefa5e0ee8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-v4h7f" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.948984 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/69f97b67-d300-4ef4-8951-934c01b68892-serving-cert\") pod \"controller-manager-879f6c89f-r6kz6\" (UID: \"69f97b67-d300-4ef4-8951-934c01b68892\") " pod="openshift-controller-manager/controller-manager-879f6c89f-r6kz6" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.953364 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a23d32e8-79df-45c1-8221-f714cbfb4a15-serving-cert\") pod \"apiserver-76f77b778f-qpxvt\" (UID: \"a23d32e8-79df-45c1-8221-f714cbfb4a15\") " pod="openshift-apiserver/apiserver-76f77b778f-qpxvt" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.965785 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a23d32e8-79df-45c1-8221-f714cbfb4a15-trusted-ca-bundle\") pod \"apiserver-76f77b778f-qpxvt\" (UID: \"a23d32e8-79df-45c1-8221-f714cbfb4a15\") " pod="openshift-apiserver/apiserver-76f77b778f-qpxvt" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.966353 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/a23d32e8-79df-45c1-8221-f714cbfb4a15-encryption-config\") pod \"apiserver-76f77b778f-qpxvt\" (UID: \"a23d32e8-79df-45c1-8221-f714cbfb4a15\") " pod="openshift-apiserver/apiserver-76f77b778f-qpxvt" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.967209 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6f13893c-16c4-4867-a020-3c8868b11859-serving-cert\") pod \"authentication-operator-69f744f599-kjfxn\" (UID: \"6f13893c-16c4-4867-a020-3c8868b11859\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-kjfxn" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.970576 4832 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-config-operator"/"mcc-proxy-tls" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.971214 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.971217 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.971356 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/20a4d831-a952-4290-9025-dffe5d65890e-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-ps468\" (UID: \"20a4d831-a952-4290-9025-dffe5d65890e\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ps468" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.971466 4832 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.971577 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.971642 4832 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7xrp7"] Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.972014 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.972123 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/a23d32e8-79df-45c1-8221-f714cbfb4a15-image-import-ca\") pod \"apiserver-76f77b778f-qpxvt\" (UID: \"a23d32e8-79df-45c1-8221-f714cbfb4a15\") " pod="openshift-apiserver/apiserver-76f77b778f-qpxvt" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.972212 4832 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4gq5l"] Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.972219 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/69f97b67-d300-4ef4-8951-934c01b68892-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-r6kz6\" (UID: \"69f97b67-d300-4ef4-8951-934c01b68892\") " pod="openshift-controller-manager/controller-manager-879f6c89f-r6kz6" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.972355 4832 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.972491 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-h4hkd" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.972585 4832 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-8pn2f"] Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.972700 4832 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4gq5l" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.972779 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7xrp7" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.973602 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8pn2f" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.974198 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.974231 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.974473 4832 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.974960 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-r6kz6"] Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.971364 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.975480 4832 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.976409 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ac043bfa-15d5-4a58-a9bc-0601bfd7a7ca-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-dpskq\" (UID: \"ac043bfa-15d5-4a58-a9bc-0601bfd7a7ca\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-dpskq" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.976662 4832 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.978159 4832 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-9g4wk"] Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.978255 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.978389 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.978714 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.979000 4832 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9g4wk" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.982188 4832 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-wml5t"] Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.984461 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69f97b67-d300-4ef4-8951-934c01b68892-config\") pod \"controller-manager-879f6c89f-r6kz6\" (UID: \"69f97b67-d300-4ef4-8951-934c01b68892\") " pod="openshift-controller-manager/controller-manager-879f6c89f-r6kz6" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.987186 4832 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-l5pfg"] Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.987480 4832 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.989641 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wml5t" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.991115 4832 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-xtqlf"] Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.991642 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-l5pfg" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.991737 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-xtqlf" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.994208 4832 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-zzj42"] Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.994709 4832 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-wbjmp"] Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.994867 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-zzj42" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.995130 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-wbjmp" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.995376 4832 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-m9qgv"] Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.995791 4832 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-m9qgv" Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.998036 4832 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-h5fz8"] Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.999518 4832 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lw2b2"] Oct 03 00:09:05 crc kubenswrapper[4832]: I1003 00:09:05.999870 4832 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-nmggh"] Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.000254 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-nmggh" Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.000307 4832 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.000555 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-h5fz8" Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.000713 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lw2b2" Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.000723 4832 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hwfjz"] Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.001456 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hwfjz" Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.003984 4832 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324160-nmbdb"] Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.004771 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324160-nmbdb" Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.009300 4832 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-jfkwz"] Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.013611 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-pruner-29324160-vrdgr"] Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.013654 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-dpskq"] Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.013670 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-v4h7f"] Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.013687 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-qpxvt"] Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.013827 4832 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jfkwz" Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.014335 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-znmk2"] Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.014957 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ps468"] Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.016070 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-kjfxn"] Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.021394 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.025732 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-vgm2t"] Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.030387 4832 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-g4wns"] Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.034272 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-g4wns" Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.036137 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-hqmpt"] Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.037294 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-zg4z4"] Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.038343 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-g2phc"] Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.039417 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-hxv6p"] Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.040251 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.040494 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7xrp7"] Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.041907 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-6v588"] Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.042622 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4gq5l"] Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.043547 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-tv5hb"] Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.044546 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-r4wx4"] Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.045633 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-config-operator/openshift-config-operator-7777fb866f-zzj42"] Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.046696 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-sl4zz"] Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.047532 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e422e2ec-57ff-4009-8065-bfa489b34334-audit-dir\") pod \"oauth-openshift-558db77b4-sl4zz\" (UID: \"e422e2ec-57ff-4009-8065-bfa489b34334\") " pod="openshift-authentication/oauth-openshift-558db77b4-sl4zz" Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.047678 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e422e2ec-57ff-4009-8065-bfa489b34334-audit-dir\") pod \"oauth-openshift-558db77b4-sl4zz\" (UID: \"e422e2ec-57ff-4009-8065-bfa489b34334\") " pod="openshift-authentication/oauth-openshift-558db77b4-sl4zz" Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.047592 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jxtq5\" (UniqueName: \"kubernetes.io/projected/c73e2dc6-6174-4a76-a7d2-192ef93b7935-kube-api-access-jxtq5\") pod \"downloads-7954f5f757-znmk2\" (UID: \"c73e2dc6-6174-4a76-a7d2-192ef93b7935\") " pod="openshift-console/downloads-7954f5f757-znmk2" Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.047780 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7367a27e-1830-4f44-b9c2-543541bbb794-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-xjqmz\" (UID: \"7367a27e-1830-4f44-b9c2-543541bbb794\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xjqmz" Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.047831 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/e422e2ec-57ff-4009-8065-bfa489b34334-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-sl4zz\" (UID: \"e422e2ec-57ff-4009-8065-bfa489b34334\") " pod="openshift-authentication/oauth-openshift-558db77b4-sl4zz" Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.047862 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/e422e2ec-57ff-4009-8065-bfa489b34334-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-sl4zz\" (UID: \"e422e2ec-57ff-4009-8065-bfa489b34334\") " pod="openshift-authentication/oauth-openshift-558db77b4-sl4zz" Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.047897 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/e422e2ec-57ff-4009-8065-bfa489b34334-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-sl4zz\" (UID: \"e422e2ec-57ff-4009-8065-bfa489b34334\") " pod="openshift-authentication/oauth-openshift-558db77b4-sl4zz" Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.047927 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-drj7g\" (UniqueName: \"kubernetes.io/projected/a3407ec5-4299-4c9d-9b37-64448de352b7-kube-api-access-drj7g\") pod 
\"console-f9d7485db-hxv6p\" (UID: \"a3407ec5-4299-4c9d-9b37-64448de352b7\") " pod="openshift-console/console-f9d7485db-hxv6p" Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.047954 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e422e2ec-57ff-4009-8065-bfa489b34334-audit-policies\") pod \"oauth-openshift-558db77b4-sl4zz\" (UID: \"e422e2ec-57ff-4009-8065-bfa489b34334\") " pod="openshift-authentication/oauth-openshift-558db77b4-sl4zz" Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.047979 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/e422e2ec-57ff-4009-8065-bfa489b34334-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-sl4zz\" (UID: \"e422e2ec-57ff-4009-8065-bfa489b34334\") " pod="openshift-authentication/oauth-openshift-558db77b4-sl4zz" Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.048012 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1d8a5bbc-dc0e-4b7b-b571-83b7be0abe64-serving-cert\") pod \"route-controller-manager-6576b87f9c-hqmpt\" (UID: \"1d8a5bbc-dc0e-4b7b-b571-83b7be0abe64\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hqmpt" Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.048046 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gb2hm\" (UniqueName: \"kubernetes.io/projected/ca8202c0-14e2-4221-9cdb-e5439ba59058-kube-api-access-gb2hm\") pod \"dns-operator-744455d44c-vgm2t\" (UID: \"ca8202c0-14e2-4221-9cdb-e5439ba59058\") " pod="openshift-dns-operator/dns-operator-744455d44c-vgm2t" Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.048087 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/17d5f7b6-d924-4095-a636-4f283d40d2e8-machine-approver-tls\") pod \"machine-approver-56656f9798-bbh52\" (UID: \"17d5f7b6-d924-4095-a636-4f283d40d2e8\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bbh52" Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.048116 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qndd5\" (UniqueName: \"kubernetes.io/projected/e422e2ec-57ff-4009-8065-bfa489b34334-kube-api-access-qndd5\") pod \"oauth-openshift-558db77b4-sl4zz\" (UID: \"e422e2ec-57ff-4009-8065-bfa489b34334\") " pod="openshift-authentication/oauth-openshift-558db77b4-sl4zz" Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.048019 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xjqmz"] Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.048165 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7j75j\" (UniqueName: \"kubernetes.io/projected/3335886b-58ec-44d4-9b6a-b7d81b4aa3fe-kube-api-access-7j75j\") pod \"image-pruner-29324160-vrdgr\" (UID: \"3335886b-58ec-44d4-9b6a-b7d81b4aa3fe\") " pod="openshift-image-registry/image-pruner-29324160-vrdgr" Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.048243 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vd7g9\" (UniqueName: 
\"kubernetes.io/projected/02e3e27d-2b35-47ed-a82f-686e4ef7dfd8-kube-api-access-vd7g9\") pod \"control-plane-machine-set-operator-78cbb6b69f-9kl2v\" (UID: \"02e3e27d-2b35-47ed-a82f-686e4ef7dfd8\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-9kl2v" Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.048280 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9dqv\" (UniqueName: \"kubernetes.io/projected/7367a27e-1830-4f44-b9c2-543541bbb794-kube-api-access-l9dqv\") pod \"cluster-image-registry-operator-dc59b4c8b-xjqmz\" (UID: \"7367a27e-1830-4f44-b9c2-543541bbb794\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xjqmz" Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.048321 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a3407ec5-4299-4c9d-9b37-64448de352b7-console-oauth-config\") pod \"console-f9d7485db-hxv6p\" (UID: \"a3407ec5-4299-4c9d-9b37-64448de352b7\") " pod="openshift-console/console-f9d7485db-hxv6p" Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.048378 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d8a5bbc-dc0e-4b7b-b571-83b7be0abe64-config\") pod \"route-controller-manager-6576b87f9c-hqmpt\" (UID: \"1d8a5bbc-dc0e-4b7b-b571-83b7be0abe64\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hqmpt" Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.048426 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/e422e2ec-57ff-4009-8065-bfa489b34334-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-sl4zz\" (UID: \"e422e2ec-57ff-4009-8065-bfa489b34334\") " pod="openshift-authentication/oauth-openshift-558db77b4-sl4zz" Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.048460 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/7367a27e-1830-4f44-b9c2-543541bbb794-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-xjqmz\" (UID: \"7367a27e-1830-4f44-b9c2-543541bbb794\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xjqmz" Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.048499 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a3407ec5-4299-4c9d-9b37-64448de352b7-trusted-ca-bundle\") pod \"console-f9d7485db-hxv6p\" (UID: \"a3407ec5-4299-4c9d-9b37-64448de352b7\") " pod="openshift-console/console-f9d7485db-hxv6p" Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.048535 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/17d5f7b6-d924-4095-a636-4f283d40d2e8-auth-proxy-config\") pod \"machine-approver-56656f9798-bbh52\" (UID: \"17d5f7b6-d924-4095-a636-4f283d40d2e8\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bbh52" Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.048576 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/e422e2ec-57ff-4009-8065-bfa489b34334-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-sl4zz\" (UID: \"e422e2ec-57ff-4009-8065-bfa489b34334\") " pod="openshift-authentication/oauth-openshift-558db77b4-sl4zz" Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.048605 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66whk\" (UniqueName: \"kubernetes.io/projected/17d5f7b6-d924-4095-a636-4f283d40d2e8-kube-api-access-66whk\") pod \"machine-approver-56656f9798-bbh52\" (UID: \"17d5f7b6-d924-4095-a636-4f283d40d2e8\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bbh52" Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.048634 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3335886b-58ec-44d4-9b6a-b7d81b4aa3fe-serviceca\") pod \"image-pruner-29324160-vrdgr\" (UID: \"3335886b-58ec-44d4-9b6a-b7d81b4aa3fe\") " pod="openshift-image-registry/image-pruner-29324160-vrdgr" Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.048659 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a3407ec5-4299-4c9d-9b37-64448de352b7-service-ca\") pod \"console-f9d7485db-hxv6p\" (UID: \"a3407ec5-4299-4c9d-9b37-64448de352b7\") " pod="openshift-console/console-f9d7485db-hxv6p" Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.048688 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ca8202c0-14e2-4221-9cdb-e5439ba59058-metrics-tls\") pod \"dns-operator-744455d44c-vgm2t\" (UID: \"ca8202c0-14e2-4221-9cdb-e5439ba59058\") " pod="openshift-dns-operator/dns-operator-744455d44c-vgm2t" Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.048721 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a3407ec5-4299-4c9d-9b37-64448de352b7-console-config\") pod \"console-f9d7485db-hxv6p\" (UID: \"a3407ec5-4299-4c9d-9b37-64448de352b7\") " pod="openshift-console/console-f9d7485db-hxv6p" Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.048790 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e422e2ec-57ff-4009-8065-bfa489b34334-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-sl4zz\" (UID: \"e422e2ec-57ff-4009-8065-bfa489b34334\") " pod="openshift-authentication/oauth-openshift-558db77b4-sl4zz" Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.048852 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-l5pfg"] Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.048821 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a3407ec5-4299-4c9d-9b37-64448de352b7-console-serving-cert\") pod \"console-f9d7485db-hxv6p\" (UID: \"a3407ec5-4299-4c9d-9b37-64448de352b7\") " pod="openshift-console/console-f9d7485db-hxv6p" Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.048928 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/e422e2ec-57ff-4009-8065-bfa489b34334-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-sl4zz\" (UID: \"e422e2ec-57ff-4009-8065-bfa489b34334\") " pod="openshift-authentication/oauth-openshift-558db77b4-sl4zz" Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.048947 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/e422e2ec-57ff-4009-8065-bfa489b34334-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-sl4zz\" (UID: \"e422e2ec-57ff-4009-8065-bfa489b34334\") " pod="openshift-authentication/oauth-openshift-558db77b4-sl4zz" Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.048978 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/02e3e27d-2b35-47ed-a82f-686e4ef7dfd8-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-9kl2v\" (UID: \"02e3e27d-2b35-47ed-a82f-686e4ef7dfd8\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-9kl2v" Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.049012 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/e422e2ec-57ff-4009-8065-bfa489b34334-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-sl4zz\" (UID: \"e422e2ec-57ff-4009-8065-bfa489b34334\") " pod="openshift-authentication/oauth-openshift-558db77b4-sl4zz" Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.049032 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7367a27e-1830-4f44-b9c2-543541bbb794-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-xjqmz\" (UID: \"7367a27e-1830-4f44-b9c2-543541bbb794\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xjqmz" Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.049051 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a3407ec5-4299-4c9d-9b37-64448de352b7-oauth-serving-cert\") pod \"console-f9d7485db-hxv6p\" (UID: \"a3407ec5-4299-4c9d-9b37-64448de352b7\") " pod="openshift-console/console-f9d7485db-hxv6p" Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.049095 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17d5f7b6-d924-4095-a636-4f283d40d2e8-config\") pod \"machine-approver-56656f9798-bbh52\" (UID: \"17d5f7b6-d924-4095-a636-4f283d40d2e8\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bbh52" Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.049127 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1d8a5bbc-dc0e-4b7b-b571-83b7be0abe64-client-ca\") pod \"route-controller-manager-6576b87f9c-hqmpt\" (UID: \"1d8a5bbc-dc0e-4b7b-b571-83b7be0abe64\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hqmpt" Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.049153 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/e422e2ec-57ff-4009-8065-bfa489b34334-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-sl4zz\" (UID: \"e422e2ec-57ff-4009-8065-bfa489b34334\") " pod="openshift-authentication/oauth-openshift-558db77b4-sl4zz" Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.049180 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fjg2h\" (UniqueName: \"kubernetes.io/projected/1d8a5bbc-dc0e-4b7b-b571-83b7be0abe64-kube-api-access-fjg2h\") pod \"route-controller-manager-6576b87f9c-hqmpt\" (UID: \"1d8a5bbc-dc0e-4b7b-b571-83b7be0abe64\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hqmpt" Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.049214 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e422e2ec-57ff-4009-8065-bfa489b34334-audit-policies\") pod \"oauth-openshift-558db77b4-sl4zz\" (UID: \"e422e2ec-57ff-4009-8065-bfa489b34334\") " pod="openshift-authentication/oauth-openshift-558db77b4-sl4zz" Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.049328 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7367a27e-1830-4f44-b9c2-543541bbb794-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-xjqmz\" (UID: \"7367a27e-1830-4f44-b9c2-543541bbb794\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xjqmz" Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.050711 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/e422e2ec-57ff-4009-8065-bfa489b34334-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-sl4zz\" (UID: \"e422e2ec-57ff-4009-8065-bfa489b34334\") " pod="openshift-authentication/oauth-openshift-558db77b4-sl4zz" Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.050800 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/e422e2ec-57ff-4009-8065-bfa489b34334-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-sl4zz\" (UID: \"e422e2ec-57ff-4009-8065-bfa489b34334\") " pod="openshift-authentication/oauth-openshift-558db77b4-sl4zz" Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.051111 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a3407ec5-4299-4c9d-9b37-64448de352b7-service-ca\") pod \"console-f9d7485db-hxv6p\" (UID: \"a3407ec5-4299-4c9d-9b37-64448de352b7\") " pod="openshift-console/console-f9d7485db-hxv6p" Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.051212 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/17d5f7b6-d924-4095-a636-4f283d40d2e8-auth-proxy-config\") pod \"machine-approver-56656f9798-bbh52\" (UID: \"17d5f7b6-d924-4095-a636-4f283d40d2e8\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bbh52" Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.051231 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e422e2ec-57ff-4009-8065-bfa489b34334-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-sl4zz\" (UID: 
\"e422e2ec-57ff-4009-8065-bfa489b34334\") " pod="openshift-authentication/oauth-openshift-558db77b4-sl4zz" Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.051500 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1d8a5bbc-dc0e-4b7b-b571-83b7be0abe64-client-ca\") pod \"route-controller-manager-6576b87f9c-hqmpt\" (UID: \"1d8a5bbc-dc0e-4b7b-b571-83b7be0abe64\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hqmpt" Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.051575 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/17d5f7b6-d924-4095-a636-4f283d40d2e8-machine-approver-tls\") pod \"machine-approver-56656f9798-bbh52\" (UID: \"17d5f7b6-d924-4095-a636-4f283d40d2e8\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bbh52" Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.051709 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a3407ec5-4299-4c9d-9b37-64448de352b7-console-config\") pod \"console-f9d7485db-hxv6p\" (UID: \"a3407ec5-4299-4c9d-9b37-64448de352b7\") " pod="openshift-console/console-f9d7485db-hxv6p" Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.052152 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1d8a5bbc-dc0e-4b7b-b571-83b7be0abe64-serving-cert\") pod \"route-controller-manager-6576b87f9c-hqmpt\" (UID: \"1d8a5bbc-dc0e-4b7b-b571-83b7be0abe64\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hqmpt" Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.052353 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/e422e2ec-57ff-4009-8065-bfa489b34334-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-sl4zz\" (UID: \"e422e2ec-57ff-4009-8065-bfa489b34334\") " pod="openshift-authentication/oauth-openshift-558db77b4-sl4zz" Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.052403 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-9g4wk"] Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.052595 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a3407ec5-4299-4c9d-9b37-64448de352b7-oauth-serving-cert\") pod \"console-f9d7485db-hxv6p\" (UID: \"a3407ec5-4299-4c9d-9b37-64448de352b7\") " pod="openshift-console/console-f9d7485db-hxv6p" Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.052885 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17d5f7b6-d924-4095-a636-4f283d40d2e8-config\") pod \"machine-approver-56656f9798-bbh52\" (UID: \"17d5f7b6-d924-4095-a636-4f283d40d2e8\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bbh52" Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.053001 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3335886b-58ec-44d4-9b6a-b7d81b4aa3fe-serviceca\") pod \"image-pruner-29324160-vrdgr\" (UID: \"3335886b-58ec-44d4-9b6a-b7d81b4aa3fe\") " 
pod="openshift-image-registry/image-pruner-29324160-vrdgr" Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.053014 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d8a5bbc-dc0e-4b7b-b571-83b7be0abe64-config\") pod \"route-controller-manager-6576b87f9c-hqmpt\" (UID: \"1d8a5bbc-dc0e-4b7b-b571-83b7be0abe64\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hqmpt" Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.053137 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a3407ec5-4299-4c9d-9b37-64448de352b7-trusted-ca-bundle\") pod \"console-f9d7485db-hxv6p\" (UID: \"a3407ec5-4299-4c9d-9b37-64448de352b7\") " pod="openshift-console/console-f9d7485db-hxv6p" Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.053531 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/e422e2ec-57ff-4009-8065-bfa489b34334-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-sl4zz\" (UID: \"e422e2ec-57ff-4009-8065-bfa489b34334\") " pod="openshift-authentication/oauth-openshift-558db77b4-sl4zz" Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.053572 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-8pn2f"] Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.053664 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a3407ec5-4299-4c9d-9b37-64448de352b7-console-oauth-config\") pod \"console-f9d7485db-hxv6p\" (UID: \"a3407ec5-4299-4c9d-9b37-64448de352b7\") " pod="openshift-console/console-f9d7485db-hxv6p" Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.054823 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-h4hkd"] Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.055858 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a3407ec5-4299-4c9d-9b37-64448de352b7-console-serving-cert\") pod \"console-f9d7485db-hxv6p\" (UID: \"a3407ec5-4299-4c9d-9b37-64448de352b7\") " pod="openshift-console/console-f9d7485db-hxv6p" Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.055929 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324160-nmbdb"] Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.055993 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/e422e2ec-57ff-4009-8065-bfa489b34334-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-sl4zz\" (UID: \"e422e2ec-57ff-4009-8065-bfa489b34334\") " pod="openshift-authentication/oauth-openshift-558db77b4-sl4zz" Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.057140 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/7367a27e-1830-4f44-b9c2-543541bbb794-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-xjqmz\" (UID: \"7367a27e-1830-4f44-b9c2-543541bbb794\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xjqmz" Oct 03 00:09:06 crc 
kubenswrapper[4832]: I1003 00:09:06.058395 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-9kl2v"] Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.059607 4832 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-qcrcz"] Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.060590 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-qcrcz" Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.060668 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-djlqs"] Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.062269 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-jfkwz"] Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.063337 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lw2b2"] Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.064300 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hwfjz"] Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.065333 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-xtqlf"] Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.066371 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-wml5t"] Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.067376 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-wbjmp"] Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.068337 4832 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-svzx5"] Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.069306 4832 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-svzx5" Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.069345 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-nmggh"] Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.070600 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-m9qgv"] Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.073165 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-qcrcz"] Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.074240 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-svzx5"] Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.075214 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ln8tt\" (UniqueName: \"kubernetes.io/projected/6f13893c-16c4-4867-a020-3c8868b11859-kube-api-access-ln8tt\") pod \"authentication-operator-69f744f599-kjfxn\" (UID: \"6f13893c-16c4-4867-a020-3c8868b11859\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-kjfxn" Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.075587 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/e422e2ec-57ff-4009-8065-bfa489b34334-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-sl4zz\" (UID: \"e422e2ec-57ff-4009-8065-bfa489b34334\") " pod="openshift-authentication/oauth-openshift-558db77b4-sl4zz" Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.075653 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-h5fz8"] Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.075662 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/e422e2ec-57ff-4009-8065-bfa489b34334-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-sl4zz\" (UID: \"e422e2ec-57ff-4009-8065-bfa489b34334\") " pod="openshift-authentication/oauth-openshift-558db77b4-sl4zz" Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.076098 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/e422e2ec-57ff-4009-8065-bfa489b34334-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-sl4zz\" (UID: \"e422e2ec-57ff-4009-8065-bfa489b34334\") " pod="openshift-authentication/oauth-openshift-558db77b4-sl4zz" Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.076645 4832 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-g59tb"] Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.076729 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/e422e2ec-57ff-4009-8065-bfa489b34334-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-sl4zz\" (UID: \"e422e2ec-57ff-4009-8065-bfa489b34334\") " pod="openshift-authentication/oauth-openshift-558db77b4-sl4zz" Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.077349 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/e422e2ec-57ff-4009-8065-bfa489b34334-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-sl4zz\" (UID: \"e422e2ec-57ff-4009-8065-bfa489b34334\") " pod="openshift-authentication/oauth-openshift-558db77b4-sl4zz" Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.078521 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-g59tb"] Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.079507 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-g59tb" Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.094594 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q47qj\" (UniqueName: \"kubernetes.io/projected/a23d32e8-79df-45c1-8221-f714cbfb4a15-kube-api-access-q47qj\") pod \"apiserver-76f77b778f-qpxvt\" (UID: \"a23d32e8-79df-45c1-8221-f714cbfb4a15\") " pod="openshift-apiserver/apiserver-76f77b778f-qpxvt" Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.098399 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-qpxvt" Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.116962 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvhx9\" (UniqueName: \"kubernetes.io/projected/69f97b67-d300-4ef4-8951-934c01b68892-kube-api-access-pvhx9\") pod \"controller-manager-879f6c89f-r6kz6\" (UID: \"69f97b67-d300-4ef4-8951-934c01b68892\") " pod="openshift-controller-manager/controller-manager-879f6c89f-r6kz6" Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.133878 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-r6kz6" Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.149954 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gb2hm\" (UniqueName: \"kubernetes.io/projected/ca8202c0-14e2-4221-9cdb-e5439ba59058-kube-api-access-gb2hm\") pod \"dns-operator-744455d44c-vgm2t\" (UID: \"ca8202c0-14e2-4221-9cdb-e5439ba59058\") " pod="openshift-dns-operator/dns-operator-744455d44c-vgm2t" Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.150032 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vd7g9\" (UniqueName: \"kubernetes.io/projected/02e3e27d-2b35-47ed-a82f-686e4ef7dfd8-kube-api-access-vd7g9\") pod \"control-plane-machine-set-operator-78cbb6b69f-9kl2v\" (UID: \"02e3e27d-2b35-47ed-a82f-686e4ef7dfd8\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-9kl2v" Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.150120 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ca8202c0-14e2-4221-9cdb-e5439ba59058-metrics-tls\") pod \"dns-operator-744455d44c-vgm2t\" (UID: \"ca8202c0-14e2-4221-9cdb-e5439ba59058\") " pod="openshift-dns-operator/dns-operator-744455d44c-vgm2t" Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.150156 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/02e3e27d-2b35-47ed-a82f-686e4ef7dfd8-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-9kl2v\" (UID: \"02e3e27d-2b35-47ed-a82f-686e4ef7dfd8\") " 
pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-9kl2v" Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.154643 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ca8202c0-14e2-4221-9cdb-e5439ba59058-metrics-tls\") pod \"dns-operator-744455d44c-vgm2t\" (UID: \"ca8202c0-14e2-4221-9cdb-e5439ba59058\") " pod="openshift-dns-operator/dns-operator-744455d44c-vgm2t" Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.155409 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-98bpm\" (UniqueName: \"kubernetes.io/projected/ac043bfa-15d5-4a58-a9bc-0601bfd7a7ca-kube-api-access-98bpm\") pod \"cluster-samples-operator-665b6dd947-dpskq\" (UID: \"ac043bfa-15d5-4a58-a9bc-0601bfd7a7ca\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-dpskq" Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.171216 4832 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.176353 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/02e3e27d-2b35-47ed-a82f-686e4ef7dfd8-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-9kl2v\" (UID: \"02e3e27d-2b35-47ed-a82f-686e4ef7dfd8\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-9kl2v" Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.176610 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-dpskq" Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.198150 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-kjfxn" Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.200551 4832 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.205124 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9prv\" (UniqueName: \"kubernetes.io/projected/20a4d831-a952-4290-9025-dffe5d65890e-kube-api-access-n9prv\") pod \"openshift-apiserver-operator-796bbdcf4f-ps468\" (UID: \"20a4d831-a952-4290-9025-dffe5d65890e\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ps468" Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.221420 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.229078 4832 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ps468" Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.242225 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.260818 4832 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.281497 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.302340 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.342633 4832 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.349303 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kt5bk\" (UniqueName: \"kubernetes.io/projected/bb3f4c9e-e9cb-4188-9a87-8bdefa5e0ee8-kube-api-access-kt5bk\") pod \"machine-api-operator-5694c8668f-v4h7f\" (UID: \"bb3f4c9e-e9cb-4188-9a87-8bdefa5e0ee8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-v4h7f" Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.361100 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.382495 4832 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.401442 4832 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.420331 4832 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.428594 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-qpxvt"] Oct 03 00:09:06 crc kubenswrapper[4832]: W1003 00:09:06.434374 4832 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda23d32e8_79df_45c1_8221_f714cbfb4a15.slice/crio-9b96aa7140cfdc6b582d2155b39742b0d2f133b32a02b22f37ecd8273a08a67c WatchSource:0}: Error finding container 9b96aa7140cfdc6b582d2155b39742b0d2f133b32a02b22f37ecd8273a08a67c: Status 404 returned error can't find the container with id 9b96aa7140cfdc6b582d2155b39742b0d2f133b32a02b22f37ecd8273a08a67c Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.441866 4832 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.460295 4832 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.480718 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 
00:09:06.481101 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-v4h7f" Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.490434 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ps468"] Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.500514 4832 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.520328 4832 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.540526 4832 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.561683 4832 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.583131 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.600286 4832 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.622254 4832 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.641607 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.646973 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-v4h7f"] Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.659813 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-r6kz6"] Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.660315 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.665326 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-dpskq"] Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.669572 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-kjfxn"] Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.680496 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.700083 4832 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Oct 03 00:09:06 crc kubenswrapper[4832]: W1003 00:09:06.704801 4832 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbb3f4c9e_e9cb_4188_9a87_8bdefa5e0ee8.slice/crio-ae1298529203679154b330969a04e7953e210487b57f7e48cf6b4566fff1df91 WatchSource:0}: Error finding container ae1298529203679154b330969a04e7953e210487b57f7e48cf6b4566fff1df91: Status 404 returned error can't find the container with id ae1298529203679154b330969a04e7953e210487b57f7e48cf6b4566fff1df91 Oct 03 00:09:06 crc kubenswrapper[4832]: W1003 00:09:06.705255 4832 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6f13893c_16c4_4867_a020_3c8868b11859.slice/crio-4c596efbce9ee2a28403cc2562c0da08a76a1b3528a92e60d4507c9f5b740585 WatchSource:0}: Error finding container 4c596efbce9ee2a28403cc2562c0da08a76a1b3528a92e60d4507c9f5b740585: Status 404 returned error can't find the container with id 4c596efbce9ee2a28403cc2562c0da08a76a1b3528a92e60d4507c9f5b740585 Oct 03 00:09:06 crc kubenswrapper[4832]: W1003 00:09:06.706916 4832 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod69f97b67_d300_4ef4_8951_934c01b68892.slice/crio-9ba7a15efaa47c8f412ab0847791326a696d4804ea562b9bc17428a3be6238fe WatchSource:0}: Error finding container 9ba7a15efaa47c8f412ab0847791326a696d4804ea562b9bc17428a3be6238fe: Status 404 returned error can't find the container with id 9ba7a15efaa47c8f412ab0847791326a696d4804ea562b9bc17428a3be6238fe Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.720449 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.743577 4832 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.760930 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.781212 4832 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.800437 4832 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.827163 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.840831 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.860318 4832 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.880542 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.900997 4832 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.921945 4832 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.960856 4832 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.980500 4832 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Oct 03 00:09:06 crc kubenswrapper[4832]: I1003 00:09:06.999475 4832 request.go:700] Waited for 1.007290604s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-multus/secrets?fieldSelector=metadata.name%3Dmultus-ac-dockercfg-9lkdf&limit=500&resourceVersion=0 Oct 03 00:09:07 crc kubenswrapper[4832]: I1003 00:09:07.001413 4832 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Oct 03 00:09:07 crc kubenswrapper[4832]: I1003 00:09:07.020995 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Oct 03 00:09:07 crc kubenswrapper[4832]: I1003 00:09:07.041661 4832 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Oct 03 00:09:07 crc kubenswrapper[4832]: I1003 00:09:07.061462 4832 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Oct 03 00:09:07 crc kubenswrapper[4832]: I1003 00:09:07.065677 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-r6kz6" event={"ID":"69f97b67-d300-4ef4-8951-934c01b68892","Type":"ContainerStarted","Data":"1928c944b1ca0615caa4bcfd67fc4354e5b11f82514ddfc974f8838458f34d63"} Oct 03 00:09:07 crc kubenswrapper[4832]: I1003 00:09:07.065930 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-r6kz6" event={"ID":"69f97b67-d300-4ef4-8951-934c01b68892","Type":"ContainerStarted","Data":"9ba7a15efaa47c8f412ab0847791326a696d4804ea562b9bc17428a3be6238fe"} Oct 03 00:09:07 crc kubenswrapper[4832]: I1003 00:09:07.066004 4832 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-r6kz6" Oct 03 00:09:07 crc kubenswrapper[4832]: I1003 00:09:07.067362 4832 generic.go:334] "Generic (PLEG): container finished" podID="a23d32e8-79df-45c1-8221-f714cbfb4a15" containerID="3b7fbd2bae210523bcd8c85a688bf3cb56430394c65ead48f8c9a47591facb6e" exitCode=0 Oct 03 00:09:07 crc kubenswrapper[4832]: I1003 00:09:07.067413 4832 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-r6kz6 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body= Oct 03 00:09:07 crc kubenswrapper[4832]: I1003 00:09:07.067969 4832 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-r6kz6" podUID="69f97b67-d300-4ef4-8951-934c01b68892" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" Oct 03 00:09:07 crc kubenswrapper[4832]: I1003 00:09:07.067451 4832 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-apiserver/apiserver-76f77b778f-qpxvt" event={"ID":"a23d32e8-79df-45c1-8221-f714cbfb4a15","Type":"ContainerDied","Data":"3b7fbd2bae210523bcd8c85a688bf3cb56430394c65ead48f8c9a47591facb6e"} Oct 03 00:09:07 crc kubenswrapper[4832]: I1003 00:09:07.068170 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-qpxvt" event={"ID":"a23d32e8-79df-45c1-8221-f714cbfb4a15","Type":"ContainerStarted","Data":"9b96aa7140cfdc6b582d2155b39742b0d2f133b32a02b22f37ecd8273a08a67c"} Oct 03 00:09:07 crc kubenswrapper[4832]: I1003 00:09:07.069456 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ps468" event={"ID":"20a4d831-a952-4290-9025-dffe5d65890e","Type":"ContainerStarted","Data":"ad34c1a41236bbc4b7bc95d67a26bbef5403cea358a94181e1eea2df548ae0bb"} Oct 03 00:09:07 crc kubenswrapper[4832]: I1003 00:09:07.069508 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ps468" event={"ID":"20a4d831-a952-4290-9025-dffe5d65890e","Type":"ContainerStarted","Data":"41fade1b602123157310d88d96f0c3718ba6aaa4320966f877e709dc181a4bf5"} Oct 03 00:09:07 crc kubenswrapper[4832]: I1003 00:09:07.070849 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-kjfxn" event={"ID":"6f13893c-16c4-4867-a020-3c8868b11859","Type":"ContainerStarted","Data":"2b764a7c5ac47d7e3f7d8606f2f396cccbc7713f3e1b9a82caac94a87eae747e"} Oct 03 00:09:07 crc kubenswrapper[4832]: I1003 00:09:07.070877 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-kjfxn" event={"ID":"6f13893c-16c4-4867-a020-3c8868b11859","Type":"ContainerStarted","Data":"4c596efbce9ee2a28403cc2562c0da08a76a1b3528a92e60d4507c9f5b740585"} Oct 03 00:09:07 crc kubenswrapper[4832]: I1003 00:09:07.073542 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-v4h7f" event={"ID":"bb3f4c9e-e9cb-4188-9a87-8bdefa5e0ee8","Type":"ContainerStarted","Data":"51b257384ae2164bee10b1afe105ddafeb445d1c4fcab466297ab9715e5e6aa8"} Oct 03 00:09:07 crc kubenswrapper[4832]: I1003 00:09:07.073571 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-v4h7f" event={"ID":"bb3f4c9e-e9cb-4188-9a87-8bdefa5e0ee8","Type":"ContainerStarted","Data":"2f42caf922e91aee3ac81f6a846c4063a74acdec41fe145a010d80ae1f726592"} Oct 03 00:09:07 crc kubenswrapper[4832]: I1003 00:09:07.073580 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-v4h7f" event={"ID":"bb3f4c9e-e9cb-4188-9a87-8bdefa5e0ee8","Type":"ContainerStarted","Data":"ae1298529203679154b330969a04e7953e210487b57f7e48cf6b4566fff1df91"} Oct 03 00:09:07 crc kubenswrapper[4832]: I1003 00:09:07.075046 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-dpskq" event={"ID":"ac043bfa-15d5-4a58-a9bc-0601bfd7a7ca","Type":"ContainerStarted","Data":"cee9a2c161141ffc3d687ad38d55738eaae9d576f44135779e2f07bf128f2163"} Oct 03 00:09:07 crc kubenswrapper[4832]: I1003 00:09:07.075073 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-dpskq" 
event={"ID":"ac043bfa-15d5-4a58-a9bc-0601bfd7a7ca","Type":"ContainerStarted","Data":"388a1b7bb7cb253f4f2d29bd09686270cf8357e9e8c5966573d1899ac19b7adc"} Oct 03 00:09:07 crc kubenswrapper[4832]: I1003 00:09:07.075082 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-dpskq" event={"ID":"ac043bfa-15d5-4a58-a9bc-0601bfd7a7ca","Type":"ContainerStarted","Data":"67e44ac116ea7d500dfb29be22d3c64bd00b35d60f31710bc723275ab033ecf5"} Oct 03 00:09:07 crc kubenswrapper[4832]: I1003 00:09:07.080002 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Oct 03 00:09:07 crc kubenswrapper[4832]: I1003 00:09:07.101190 4832 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Oct 03 00:09:07 crc kubenswrapper[4832]: I1003 00:09:07.121891 4832 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Oct 03 00:09:07 crc kubenswrapper[4832]: I1003 00:09:07.141477 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Oct 03 00:09:07 crc kubenswrapper[4832]: I1003 00:09:07.161208 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Oct 03 00:09:07 crc kubenswrapper[4832]: I1003 00:09:07.184586 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Oct 03 00:09:07 crc kubenswrapper[4832]: I1003 00:09:07.201272 4832 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Oct 03 00:09:07 crc kubenswrapper[4832]: I1003 00:09:07.220890 4832 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Oct 03 00:09:07 crc kubenswrapper[4832]: I1003 00:09:07.244536 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Oct 03 00:09:07 crc kubenswrapper[4832]: I1003 00:09:07.260936 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Oct 03 00:09:07 crc kubenswrapper[4832]: I1003 00:09:07.279918 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Oct 03 00:09:07 crc kubenswrapper[4832]: I1003 00:09:07.303943 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Oct 03 00:09:07 crc kubenswrapper[4832]: I1003 00:09:07.321479 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Oct 03 00:09:07 crc kubenswrapper[4832]: I1003 00:09:07.342335 4832 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Oct 03 00:09:07 crc kubenswrapper[4832]: I1003 00:09:07.363173 4832 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Oct 03 00:09:07 crc kubenswrapper[4832]: I1003 00:09:07.381820 4832 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Oct 03 00:09:07 crc kubenswrapper[4832]: I1003 00:09:07.401255 4832 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Oct 03 00:09:07 crc kubenswrapper[4832]: I1003 00:09:07.421539 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Oct 03 00:09:07 crc kubenswrapper[4832]: I1003 00:09:07.440350 4832 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Oct 03 00:09:07 crc kubenswrapper[4832]: I1003 00:09:07.461896 4832 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Oct 03 00:09:07 crc kubenswrapper[4832]: I1003 00:09:07.480597 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Oct 03 00:09:07 crc kubenswrapper[4832]: I1003 00:09:07.500248 4832 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Oct 03 00:09:07 crc kubenswrapper[4832]: I1003 00:09:07.521418 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 03 00:09:07 crc kubenswrapper[4832]: I1003 00:09:07.541267 4832 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 03 00:09:07 crc kubenswrapper[4832]: I1003 00:09:07.560406 4832 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Oct 03 00:09:07 crc kubenswrapper[4832]: I1003 00:09:07.580223 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Oct 03 00:09:07 crc kubenswrapper[4832]: I1003 00:09:07.601064 4832 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Oct 03 00:09:07 crc kubenswrapper[4832]: I1003 00:09:07.620353 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Oct 03 00:09:07 crc kubenswrapper[4832]: I1003 00:09:07.640596 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Oct 03 00:09:07 crc kubenswrapper[4832]: I1003 00:09:07.660143 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Oct 03 00:09:07 crc kubenswrapper[4832]: I1003 00:09:07.681001 4832 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Oct 03 00:09:07 crc kubenswrapper[4832]: I1003 00:09:07.700874 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Oct 03 00:09:07 crc kubenswrapper[4832]: I1003 00:09:07.722432 4832 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Oct 03 00:09:07 crc kubenswrapper[4832]: I1003 00:09:07.741375 4832 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Oct 03 00:09:07 crc kubenswrapper[4832]: I1003 00:09:07.760848 4832 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Oct 03 00:09:07 crc 
kubenswrapper[4832]: I1003 00:09:07.781095 4832 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Oct 03 00:09:07 crc kubenswrapper[4832]: I1003 00:09:07.825962 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-drj7g\" (UniqueName: \"kubernetes.io/projected/a3407ec5-4299-4c9d-9b37-64448de352b7-kube-api-access-drj7g\") pod \"console-f9d7485db-hxv6p\" (UID: \"a3407ec5-4299-4c9d-9b37-64448de352b7\") " pod="openshift-console/console-f9d7485db-hxv6p" Oct 03 00:09:07 crc kubenswrapper[4832]: I1003 00:09:07.835231 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-hxv6p" Oct 03 00:09:07 crc kubenswrapper[4832]: I1003 00:09:07.842507 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7j75j\" (UniqueName: \"kubernetes.io/projected/3335886b-58ec-44d4-9b6a-b7d81b4aa3fe-kube-api-access-7j75j\") pod \"image-pruner-29324160-vrdgr\" (UID: \"3335886b-58ec-44d4-9b6a-b7d81b4aa3fe\") " pod="openshift-image-registry/image-pruner-29324160-vrdgr" Oct 03 00:09:07 crc kubenswrapper[4832]: I1003 00:09:07.857643 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qndd5\" (UniqueName: \"kubernetes.io/projected/e422e2ec-57ff-4009-8065-bfa489b34334-kube-api-access-qndd5\") pod \"oauth-openshift-558db77b4-sl4zz\" (UID: \"e422e2ec-57ff-4009-8065-bfa489b34334\") " pod="openshift-authentication/oauth-openshift-558db77b4-sl4zz" Oct 03 00:09:07 crc kubenswrapper[4832]: I1003 00:09:07.879419 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjg2h\" (UniqueName: \"kubernetes.io/projected/1d8a5bbc-dc0e-4b7b-b571-83b7be0abe64-kube-api-access-fjg2h\") pod \"route-controller-manager-6576b87f9c-hqmpt\" (UID: \"1d8a5bbc-dc0e-4b7b-b571-83b7be0abe64\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hqmpt" Oct 03 00:09:07 crc kubenswrapper[4832]: I1003 00:09:07.899356 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7367a27e-1830-4f44-b9c2-543541bbb794-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-xjqmz\" (UID: \"7367a27e-1830-4f44-b9c2-543541bbb794\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xjqmz" Oct 03 00:09:07 crc kubenswrapper[4832]: I1003 00:09:07.911232 4832 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-pruner-29324160-vrdgr" Oct 03 00:09:07 crc kubenswrapper[4832]: I1003 00:09:07.928748 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jxtq5\" (UniqueName: \"kubernetes.io/projected/c73e2dc6-6174-4a76-a7d2-192ef93b7935-kube-api-access-jxtq5\") pod \"downloads-7954f5f757-znmk2\" (UID: \"c73e2dc6-6174-4a76-a7d2-192ef93b7935\") " pod="openshift-console/downloads-7954f5f757-znmk2" Oct 03 00:09:07 crc kubenswrapper[4832]: I1003 00:09:07.945499 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9dqv\" (UniqueName: \"kubernetes.io/projected/7367a27e-1830-4f44-b9c2-543541bbb794-kube-api-access-l9dqv\") pod \"cluster-image-registry-operator-dc59b4c8b-xjqmz\" (UID: \"7367a27e-1830-4f44-b9c2-543541bbb794\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xjqmz" Oct 03 00:09:07 crc kubenswrapper[4832]: I1003 00:09:07.966107 4832 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Oct 03 00:09:07 crc kubenswrapper[4832]: I1003 00:09:07.974242 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-66whk\" (UniqueName: \"kubernetes.io/projected/17d5f7b6-d924-4095-a636-4f283d40d2e8-kube-api-access-66whk\") pod \"machine-approver-56656f9798-bbh52\" (UID: \"17d5f7b6-d924-4095-a636-4f283d40d2e8\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bbh52" Oct 03 00:09:07 crc kubenswrapper[4832]: I1003 00:09:07.982428 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.000897 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.019120 4832 request.go:700] Waited for 1.958183813s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-ingress-canary/secrets?fieldSelector=metadata.name%3Dcanary-serving-cert&limit=500&resourceVersion=0 Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.020667 4832 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.040965 4832 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.060741 4832 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.084115 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.097610 4832 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-znmk2" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.098238 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-hxv6p"] Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.101459 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.104396 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-qpxvt" event={"ID":"a23d32e8-79df-45c1-8221-f714cbfb4a15","Type":"ContainerStarted","Data":"87eea0e42967a55c579ee9647866c34eaeae0e86d47d92c1429f72384bfcc8c1"} Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.104466 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-qpxvt" event={"ID":"a23d32e8-79df-45c1-8221-f714cbfb4a15","Type":"ContainerStarted","Data":"a995961cd46296df213843e6a1091228cb3905b52aa84bdd6314dbc39d2ff766"} Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.110554 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-sl4zz" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.118903 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bbh52" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.123736 4832 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-r6kz6" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.127266 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xjqmz" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.131088 4832 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.145444 4832 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hqmpt" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.146447 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.194846 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-pruner-29324160-vrdgr"] Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.234242 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vd7g9\" (UniqueName: \"kubernetes.io/projected/02e3e27d-2b35-47ed-a82f-686e4ef7dfd8-kube-api-access-vd7g9\") pod \"control-plane-machine-set-operator-78cbb6b69f-9kl2v\" (UID: \"02e3e27d-2b35-47ed-a82f-686e4ef7dfd8\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-9kl2v" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.264276 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gb2hm\" (UniqueName: \"kubernetes.io/projected/ca8202c0-14e2-4221-9cdb-e5439ba59058-kube-api-access-gb2hm\") pod \"dns-operator-744455d44c-vgm2t\" (UID: \"ca8202c0-14e2-4221-9cdb-e5439ba59058\") " pod="openshift-dns-operator/dns-operator-744455d44c-vgm2t" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.280475 4832 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-dpskq" podStartSLOduration=118.280453466 podStartE2EDuration="1m58.280453466s" podCreationTimestamp="2025-10-03 00:07:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 00:09:08.278600214 +0000 UTC m=+141.540755320" watchObservedRunningTime="2025-10-03 00:09:08.280453466 +0000 UTC m=+141.542608572" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.287532 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8793993f-a364-4eb7-bb10-d30f58f0e2c0-registry-tls\") pod \"image-registry-697d97f7c8-h4hkd\" (UID: \"8793993f-a364-4eb7-bb10-d30f58f0e2c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-h4hkd" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.287595 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8445ce5e-0123-4888-a603-9802c94e9c09-serving-cert\") pod \"console-operator-58897d9998-r4wx4\" (UID: \"8445ce5e-0123-4888-a603-9802c94e9c09\") " pod="openshift-console-operator/console-operator-58897d9998-r4wx4" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.287619 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ed757b20-1bd6-4a44-a75a-f1698623d397-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-djlqs\" (UID: \"ed757b20-1bd6-4a44-a75a-f1698623d397\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-djlqs" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.287639 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/127e79f7-9345-4611-b7eb-daf32a27f413-etcd-client\") pod 
\"etcd-operator-b45778765-zg4z4\" (UID: \"127e79f7-9345-4611-b7eb-daf32a27f413\") " pod="openshift-etcd-operator/etcd-operator-b45778765-zg4z4" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.287659 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8315e005-d902-4564-a12a-83dfe30643f2-auth-proxy-config\") pod \"machine-config-operator-74547568cd-8pn2f\" (UID: \"8315e005-d902-4564-a12a-83dfe30643f2\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8pn2f" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.287709 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/28399927-5270-4b93-ae66-5c08a318b376-stats-auth\") pod \"router-default-5444994796-w8cn7\" (UID: \"28399927-5270-4b93-ae66-5c08a318b376\") " pod="openshift-ingress/router-default-5444994796-w8cn7" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.287725 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pnxzh\" (UniqueName: \"kubernetes.io/projected/127e79f7-9345-4611-b7eb-daf32a27f413-kube-api-access-pnxzh\") pod \"etcd-operator-b45778765-zg4z4\" (UID: \"127e79f7-9345-4611-b7eb-daf32a27f413\") " pod="openshift-etcd-operator/etcd-operator-b45778765-zg4z4" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.287807 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/127e79f7-9345-4611-b7eb-daf32a27f413-etcd-ca\") pod \"etcd-operator-b45778765-zg4z4\" (UID: \"127e79f7-9345-4611-b7eb-daf32a27f413\") " pod="openshift-etcd-operator/etcd-operator-b45778765-zg4z4" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.287825 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8315e005-d902-4564-a12a-83dfe30643f2-proxy-tls\") pod \"machine-config-operator-74547568cd-8pn2f\" (UID: \"8315e005-d902-4564-a12a-83dfe30643f2\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8pn2f" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.287891 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8445ce5e-0123-4888-a603-9802c94e9c09-config\") pod \"console-operator-58897d9998-r4wx4\" (UID: \"8445ce5e-0123-4888-a603-9802c94e9c09\") " pod="openshift-console-operator/console-operator-58897d9998-r4wx4" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.289458 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqc6m\" (UniqueName: \"kubernetes.io/projected/8793993f-a364-4eb7-bb10-d30f58f0e2c0-kube-api-access-fqc6m\") pod \"image-registry-697d97f7c8-h4hkd\" (UID: \"8793993f-a364-4eb7-bb10-d30f58f0e2c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-h4hkd" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.290218 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/127e79f7-9345-4611-b7eb-daf32a27f413-etcd-service-ca\") pod \"etcd-operator-b45778765-zg4z4\" (UID: \"127e79f7-9345-4611-b7eb-daf32a27f413\") " 
pod="openshift-etcd-operator/etcd-operator-b45778765-zg4z4" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.290345 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7a1bebd5-fb32-4e26-8a34-2efba789ef00-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-4gq5l\" (UID: \"7a1bebd5-fb32-4e26-8a34-2efba789ef00\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4gq5l" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.290478 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhwfw\" (UniqueName: \"kubernetes.io/projected/ed946177-7292-4d1e-8d89-a7e383701224-kube-api-access-mhwfw\") pod \"migrator-59844c95c7-wml5t\" (UID: \"ed946177-7292-4d1e-8d89-a7e383701224\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wml5t" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.290565 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c085ae2a-3f96-435a-9104-64d8db170233-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-g2phc\" (UID: \"c085ae2a-3f96-435a-9104-64d8db170233\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-g2phc" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.290651 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/127e79f7-9345-4611-b7eb-daf32a27f413-serving-cert\") pod \"etcd-operator-b45778765-zg4z4\" (UID: \"127e79f7-9345-4611-b7eb-daf32a27f413\") " pod="openshift-etcd-operator/etcd-operator-b45778765-zg4z4" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.290937 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1c106f56-cda7-41ea-9c9b-9b797ccf621b-metrics-tls\") pod \"ingress-operator-5b745b69d9-9g4wk\" (UID: \"1c106f56-cda7-41ea-9c9b-9b797ccf621b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9g4wk" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.290997 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49b771c8-c171-49ac-a805-c13f3fc308e5-config\") pod \"kube-controller-manager-operator-78b949d7b-tv5hb\" (UID: \"49b771c8-c171-49ac-a805-c13f3fc308e5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-tv5hb" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.291033 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8m9z\" (UniqueName: \"kubernetes.io/projected/8445ce5e-0123-4888-a603-9802c94e9c09-kube-api-access-c8m9z\") pod \"console-operator-58897d9998-r4wx4\" (UID: \"8445ce5e-0123-4888-a603-9802c94e9c09\") " pod="openshift-console-operator/console-operator-58897d9998-r4wx4" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.291056 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tlbzn\" (UniqueName: \"kubernetes.io/projected/9157b726-b8a9-497e-b833-8612e28868d1-kube-api-access-tlbzn\") pod 
\"machine-config-controller-84d6567774-6v588\" (UID: \"9157b726-b8a9-497e-b833-8612e28868d1\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6v588" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.292380 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fnjzk\" (UniqueName: \"kubernetes.io/projected/c085ae2a-3f96-435a-9104-64d8db170233-kube-api-access-fnjzk\") pod \"openshift-controller-manager-operator-756b6f6bc6-g2phc\" (UID: \"c085ae2a-3f96-435a-9104-64d8db170233\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-g2phc" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.329345 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9157b726-b8a9-497e-b833-8612e28868d1-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-6v588\" (UID: \"9157b726-b8a9-497e-b833-8612e28868d1\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6v588" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.332548 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/49b771c8-c171-49ac-a805-c13f3fc308e5-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-tv5hb\" (UID: \"49b771c8-c171-49ac-a805-c13f3fc308e5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-tv5hb" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.332612 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/28399927-5270-4b93-ae66-5c08a318b376-service-ca-bundle\") pod \"router-default-5444994796-w8cn7\" (UID: \"28399927-5270-4b93-ae66-5c08a318b376\") " pod="openshift-ingress/router-default-5444994796-w8cn7" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.332704 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h4hkd\" (UID: \"8793993f-a364-4eb7-bb10-d30f58f0e2c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-h4hkd" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.333150 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8793993f-a364-4eb7-bb10-d30f58f0e2c0-trusted-ca\") pod \"image-registry-697d97f7c8-h4hkd\" (UID: \"8793993f-a364-4eb7-bb10-d30f58f0e2c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-h4hkd" Oct 03 00:09:08 crc kubenswrapper[4832]: E1003 00:09:08.333986 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 00:09:08.833955655 +0000 UTC m=+142.096110761 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h4hkd" (UID: "8793993f-a364-4eb7-bb10-d30f58f0e2c0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.336172 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/49b771c8-c171-49ac-a805-c13f3fc308e5-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-tv5hb\" (UID: \"49b771c8-c171-49ac-a805-c13f3fc308e5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-tv5hb" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.336561 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/8315e005-d902-4564-a12a-83dfe30643f2-images\") pod \"machine-config-operator-74547568cd-8pn2f\" (UID: \"8315e005-d902-4564-a12a-83dfe30643f2\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8pn2f" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.336597 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7q2c\" (UniqueName: \"kubernetes.io/projected/8315e005-d902-4564-a12a-83dfe30643f2-kube-api-access-j7q2c\") pod \"machine-config-operator-74547568cd-8pn2f\" (UID: \"8315e005-d902-4564-a12a-83dfe30643f2\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8pn2f" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.338223 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8793993f-a364-4eb7-bb10-d30f58f0e2c0-bound-sa-token\") pod \"image-registry-697d97f7c8-h4hkd\" (UID: \"8793993f-a364-4eb7-bb10-d30f58f0e2c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-h4hkd" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.338268 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42qsw\" (UniqueName: \"kubernetes.io/projected/19441502-65d7-49aa-adc0-1689bbecd75e-kube-api-access-42qsw\") pod \"catalog-operator-68c6474976-7xrp7\" (UID: \"19441502-65d7-49aa-adc0-1689bbecd75e\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7xrp7" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.338304 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42r4w\" (UniqueName: \"kubernetes.io/projected/9e7e54a5-be5d-4ae2-a4e2-a5f1a7349af8-kube-api-access-42r4w\") pod \"package-server-manager-789f6589d5-l5pfg\" (UID: \"9e7e54a5-be5d-4ae2-a4e2-a5f1a7349af8\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-l5pfg" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.338337 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8445ce5e-0123-4888-a603-9802c94e9c09-trusted-ca\") pod \"console-operator-58897d9998-r4wx4\" (UID: 
\"8445ce5e-0123-4888-a603-9802c94e9c09\") " pod="openshift-console-operator/console-operator-58897d9998-r4wx4" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.338716 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/28399927-5270-4b93-ae66-5c08a318b376-metrics-certs\") pod \"router-default-5444994796-w8cn7\" (UID: \"28399927-5270-4b93-ae66-5c08a318b376\") " pod="openshift-ingress/router-default-5444994796-w8cn7" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.338803 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c085ae2a-3f96-435a-9104-64d8db170233-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-g2phc\" (UID: \"c085ae2a-3f96-435a-9104-64d8db170233\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-g2phc" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.339395 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a1bebd5-fb32-4e26-8a34-2efba789ef00-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-4gq5l\" (UID: \"7a1bebd5-fb32-4e26-8a34-2efba789ef00\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4gq5l" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.339491 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1c106f56-cda7-41ea-9c9b-9b797ccf621b-trusted-ca\") pod \"ingress-operator-5b745b69d9-9g4wk\" (UID: \"1c106f56-cda7-41ea-9c9b-9b797ccf621b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9g4wk" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.339593 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kc868\" (UniqueName: \"kubernetes.io/projected/7a1bebd5-fb32-4e26-8a34-2efba789ef00-kube-api-access-kc868\") pod \"kube-storage-version-migrator-operator-b67b599dd-4gq5l\" (UID: \"7a1bebd5-fb32-4e26-8a34-2efba789ef00\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4gq5l" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.346723 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8793993f-a364-4eb7-bb10-d30f58f0e2c0-ca-trust-extracted\") pod \"image-registry-697d97f7c8-h4hkd\" (UID: \"8793993f-a364-4eb7-bb10-d30f58f0e2c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-h4hkd" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.348986 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8793993f-a364-4eb7-bb10-d30f58f0e2c0-registry-certificates\") pod \"image-registry-697d97f7c8-h4hkd\" (UID: \"8793993f-a364-4eb7-bb10-d30f58f0e2c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-h4hkd" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.353014 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/8793993f-a364-4eb7-bb10-d30f58f0e2c0-installation-pull-secrets\") pod \"image-registry-697d97f7c8-h4hkd\" (UID: \"8793993f-a364-4eb7-bb10-d30f58f0e2c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-h4hkd" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.353146 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed757b20-1bd6-4a44-a75a-f1698623d397-config\") pod \"kube-apiserver-operator-766d6c64bb-djlqs\" (UID: \"ed757b20-1bd6-4a44-a75a-f1698623d397\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-djlqs" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.353204 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/19441502-65d7-49aa-adc0-1689bbecd75e-profile-collector-cert\") pod \"catalog-operator-68c6474976-7xrp7\" (UID: \"19441502-65d7-49aa-adc0-1689bbecd75e\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7xrp7" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.353253 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/19441502-65d7-49aa-adc0-1689bbecd75e-srv-cert\") pod \"catalog-operator-68c6474976-7xrp7\" (UID: \"19441502-65d7-49aa-adc0-1689bbecd75e\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7xrp7" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.353282 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ed757b20-1bd6-4a44-a75a-f1698623d397-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-djlqs\" (UID: \"ed757b20-1bd6-4a44-a75a-f1698623d397\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-djlqs" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.353373 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flzsh\" (UniqueName: \"kubernetes.io/projected/1c106f56-cda7-41ea-9c9b-9b797ccf621b-kube-api-access-flzsh\") pod \"ingress-operator-5b745b69d9-9g4wk\" (UID: \"1c106f56-cda7-41ea-9c9b-9b797ccf621b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9g4wk" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.353430 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/28399927-5270-4b93-ae66-5c08a318b376-default-certificate\") pod \"router-default-5444994796-w8cn7\" (UID: \"28399927-5270-4b93-ae66-5c08a318b376\") " pod="openshift-ingress/router-default-5444994796-w8cn7" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.353453 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9157b726-b8a9-497e-b833-8612e28868d1-proxy-tls\") pod \"machine-config-controller-84d6567774-6v588\" (UID: \"9157b726-b8a9-497e-b833-8612e28868d1\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6v588" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.353495 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/127e79f7-9345-4611-b7eb-daf32a27f413-config\") pod \"etcd-operator-b45778765-zg4z4\" (UID: \"127e79f7-9345-4611-b7eb-daf32a27f413\") " pod="openshift-etcd-operator/etcd-operator-b45778765-zg4z4" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.353521 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1c106f56-cda7-41ea-9c9b-9b797ccf621b-bound-sa-token\") pod \"ingress-operator-5b745b69d9-9g4wk\" (UID: \"1c106f56-cda7-41ea-9c9b-9b797ccf621b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9g4wk" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.353543 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55rx2\" (UniqueName: \"kubernetes.io/projected/28399927-5270-4b93-ae66-5c08a318b376-kube-api-access-55rx2\") pod \"router-default-5444994796-w8cn7\" (UID: \"28399927-5270-4b93-ae66-5c08a318b376\") " pod="openshift-ingress/router-default-5444994796-w8cn7" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.353567 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/9e7e54a5-be5d-4ae2-a4e2-a5f1a7349af8-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-l5pfg\" (UID: \"9e7e54a5-be5d-4ae2-a4e2-a5f1a7349af8\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-l5pfg" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.455802 4832 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-qpxvt" podStartSLOduration=118.455778871 podStartE2EDuration="1m58.455778871s" podCreationTimestamp="2025-10-03 00:07:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 00:09:08.455547273 +0000 UTC m=+141.717702379" watchObservedRunningTime="2025-10-03 00:09:08.455778871 +0000 UTC m=+141.717933977" Oct 03 00:09:08 crc kubenswrapper[4832]: E1003 00:09:08.456232 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 00:09:08.956213444 +0000 UTC m=+142.218368550 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.456160 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.457174 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-flzsh\" (UniqueName: \"kubernetes.io/projected/1c106f56-cda7-41ea-9c9b-9b797ccf621b-kube-api-access-flzsh\") pod \"ingress-operator-5b745b69d9-9g4wk\" (UID: \"1c106f56-cda7-41ea-9c9b-9b797ccf621b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9g4wk" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.457205 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-prk2p\" (UniqueName: \"kubernetes.io/projected/476e6771-16a7-4aea-bbed-7fce15ec230d-kube-api-access-prk2p\") pod \"dns-default-svzx5\" (UID: \"476e6771-16a7-4aea-bbed-7fce15ec230d\") " pod="openshift-dns/dns-default-svzx5" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.457237 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/27f98a08-f7d9-4057-a44c-30c88bfea40a-apiservice-cert\") pod \"packageserver-d55dfcdfc-hwfjz\" (UID: \"27f98a08-f7d9-4057-a44c-30c88bfea40a\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hwfjz" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.457254 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/8e92b02b-2ba0-4d57-b018-5592e79a239e-registration-dir\") pod \"csi-hostpathplugin-g59tb\" (UID: \"8e92b02b-2ba0-4d57-b018-5592e79a239e\") " pod="hostpath-provisioner/csi-hostpathplugin-g59tb" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.457273 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/8e92b02b-2ba0-4d57-b018-5592e79a239e-csi-data-dir\") pod \"csi-hostpathplugin-g59tb\" (UID: \"8e92b02b-2ba0-4d57-b018-5592e79a239e\") " pod="hostpath-provisioner/csi-hostpathplugin-g59tb" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.457291 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/28399927-5270-4b93-ae66-5c08a318b376-default-certificate\") pod \"router-default-5444994796-w8cn7\" (UID: \"28399927-5270-4b93-ae66-5c08a318b376\") " pod="openshift-ingress/router-default-5444994796-w8cn7" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.457308 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/9157b726-b8a9-497e-b833-8612e28868d1-proxy-tls\") pod \"machine-config-controller-84d6567774-6v588\" (UID: \"9157b726-b8a9-497e-b833-8612e28868d1\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6v588" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.457323 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/6dba70f2-c60f-4d7a-a288-8597b4f95145-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-xtqlf\" (UID: \"6dba70f2-c60f-4d7a-a288-8597b4f95145\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-xtqlf" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.457360 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/8206a415-c51d-4ed4-bf97-1c00f05810f4-profile-collector-cert\") pod \"olm-operator-6b444d44fb-lw2b2\" (UID: \"8206a415-c51d-4ed4-bf97-1c00f05810f4\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lw2b2" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.457394 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/127e79f7-9345-4611-b7eb-daf32a27f413-config\") pod \"etcd-operator-b45778765-zg4z4\" (UID: \"127e79f7-9345-4611-b7eb-daf32a27f413\") " pod="openshift-etcd-operator/etcd-operator-b45778765-zg4z4" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.457414 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1c106f56-cda7-41ea-9c9b-9b797ccf621b-bound-sa-token\") pod \"ingress-operator-5b745b69d9-9g4wk\" (UID: \"1c106f56-cda7-41ea-9c9b-9b797ccf621b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9g4wk" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.457430 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/8e92b02b-2ba0-4d57-b018-5592e79a239e-plugins-dir\") pod \"csi-hostpathplugin-g59tb\" (UID: \"8e92b02b-2ba0-4d57-b018-5592e79a239e\") " pod="hostpath-provisioner/csi-hostpathplugin-g59tb" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.457448 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-55rx2\" (UniqueName: \"kubernetes.io/projected/28399927-5270-4b93-ae66-5c08a318b376-kube-api-access-55rx2\") pod \"router-default-5444994796-w8cn7\" (UID: \"28399927-5270-4b93-ae66-5c08a318b376\") " pod="openshift-ingress/router-default-5444994796-w8cn7" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.457465 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4a1f4f2f-c8de-4020-ac5c-470ca0fd95eb-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-h5fz8\" (UID: \"4a1f4f2f-c8de-4020-ac5c-470ca0fd95eb\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-h5fz8" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.457483 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/8ac12ff5-1646-45c2-822c-7741b913d805-marketplace-operator-metrics\") pod 
\"marketplace-operator-79b997595-wbjmp\" (UID: \"8ac12ff5-1646-45c2-822c-7741b913d805\") " pod="openshift-marketplace/marketplace-operator-79b997595-wbjmp" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.457510 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/9e7e54a5-be5d-4ae2-a4e2-a5f1a7349af8-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-l5pfg\" (UID: \"9e7e54a5-be5d-4ae2-a4e2-a5f1a7349af8\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-l5pfg" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.457540 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8793993f-a364-4eb7-bb10-d30f58f0e2c0-registry-tls\") pod \"image-registry-697d97f7c8-h4hkd\" (UID: \"8793993f-a364-4eb7-bb10-d30f58f0e2c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-h4hkd" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.457560 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8445ce5e-0123-4888-a603-9802c94e9c09-serving-cert\") pod \"console-operator-58897d9998-r4wx4\" (UID: \"8445ce5e-0123-4888-a603-9802c94e9c09\") " pod="openshift-console-operator/console-operator-58897d9998-r4wx4" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.457575 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ed757b20-1bd6-4a44-a75a-f1698623d397-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-djlqs\" (UID: \"ed757b20-1bd6-4a44-a75a-f1698623d397\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-djlqs" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.457594 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/127e79f7-9345-4611-b7eb-daf32a27f413-etcd-client\") pod \"etcd-operator-b45778765-zg4z4\" (UID: \"127e79f7-9345-4611-b7eb-daf32a27f413\") " pod="openshift-etcd-operator/etcd-operator-b45778765-zg4z4" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.457617 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8315e005-d902-4564-a12a-83dfe30643f2-auth-proxy-config\") pod \"machine-config-operator-74547568cd-8pn2f\" (UID: \"8315e005-d902-4564-a12a-83dfe30643f2\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8pn2f" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.457641 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/2573fb7d-f41d-4f5d-a2da-b9c276c817f0-available-featuregates\") pod \"openshift-config-operator-7777fb866f-zzj42\" (UID: \"2573fb7d-f41d-4f5d-a2da-b9c276c817f0\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-zzj42" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.457666 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8ac12ff5-1646-45c2-822c-7741b913d805-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-wbjmp\" (UID: 
\"8ac12ff5-1646-45c2-822c-7741b913d805\") " pod="openshift-marketplace/marketplace-operator-79b997595-wbjmp" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.457685 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf2d8832-8f75-4a49-9f0e-18ea211acb4c-config\") pod \"service-ca-operator-777779d784-nmggh\" (UID: \"cf2d8832-8f75-4a49-9f0e-18ea211acb4c\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-nmggh" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.457701 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rhsp\" (UniqueName: \"kubernetes.io/projected/2573fb7d-f41d-4f5d-a2da-b9c276c817f0-kube-api-access-4rhsp\") pod \"openshift-config-operator-7777fb866f-zzj42\" (UID: \"2573fb7d-f41d-4f5d-a2da-b9c276c817f0\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-zzj42" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.457721 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/28399927-5270-4b93-ae66-5c08a318b376-stats-auth\") pod \"router-default-5444994796-w8cn7\" (UID: \"28399927-5270-4b93-ae66-5c08a318b376\") " pod="openshift-ingress/router-default-5444994796-w8cn7" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.457740 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pnxzh\" (UniqueName: \"kubernetes.io/projected/127e79f7-9345-4611-b7eb-daf32a27f413-kube-api-access-pnxzh\") pod \"etcd-operator-b45778765-zg4z4\" (UID: \"127e79f7-9345-4611-b7eb-daf32a27f413\") " pod="openshift-etcd-operator/etcd-operator-b45778765-zg4z4" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.457777 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/127e79f7-9345-4611-b7eb-daf32a27f413-etcd-ca\") pod \"etcd-operator-b45778765-zg4z4\" (UID: \"127e79f7-9345-4611-b7eb-daf32a27f413\") " pod="openshift-etcd-operator/etcd-operator-b45778765-zg4z4" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.457793 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nzfxt\" (UniqueName: \"kubernetes.io/projected/cf2d8832-8f75-4a49-9f0e-18ea211acb4c-kube-api-access-nzfxt\") pod \"service-ca-operator-777779d784-nmggh\" (UID: \"cf2d8832-8f75-4a49-9f0e-18ea211acb4c\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-nmggh" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.457812 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/7b969c27-e4b6-4cde-8263-b2d879764809-encryption-config\") pod \"apiserver-7bbb656c7d-jfkwz\" (UID: \"7b969c27-e4b6-4cde-8263-b2d879764809\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jfkwz" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.457831 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8315e005-d902-4564-a12a-83dfe30643f2-proxy-tls\") pod \"machine-config-operator-74547568cd-8pn2f\" (UID: \"8315e005-d902-4564-a12a-83dfe30643f2\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8pn2f" Oct 03 00:09:08 crc kubenswrapper[4832]: 
I1003 00:09:08.457854 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7b969c27-e4b6-4cde-8263-b2d879764809-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-jfkwz\" (UID: \"7b969c27-e4b6-4cde-8263-b2d879764809\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jfkwz" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.457873 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a1f4f2f-c8de-4020-ac5c-470ca0fd95eb-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-h5fz8\" (UID: \"4a1f4f2f-c8de-4020-ac5c-470ca0fd95eb\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-h5fz8" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.457892 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8445ce5e-0123-4888-a603-9802c94e9c09-config\") pod \"console-operator-58897d9998-r4wx4\" (UID: \"8445ce5e-0123-4888-a603-9802c94e9c09\") " pod="openshift-console-operator/console-operator-58897d9998-r4wx4" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.457909 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jd7nk\" (UniqueName: \"kubernetes.io/projected/e78dfd8f-d109-412c-afcc-2e9fe3786bd3-kube-api-access-jd7nk\") pod \"collect-profiles-29324160-nmbdb\" (UID: \"e78dfd8f-d109-412c-afcc-2e9fe3786bd3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324160-nmbdb" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.457945 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2gxmq\" (UniqueName: \"kubernetes.io/projected/8e92b02b-2ba0-4d57-b018-5592e79a239e-kube-api-access-2gxmq\") pod \"csi-hostpathplugin-g59tb\" (UID: \"8e92b02b-2ba0-4d57-b018-5592e79a239e\") " pod="hostpath-provisioner/csi-hostpathplugin-g59tb" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.457960 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7b969c27-e4b6-4cde-8263-b2d879764809-audit-policies\") pod \"apiserver-7bbb656c7d-jfkwz\" (UID: \"7b969c27-e4b6-4cde-8263-b2d879764809\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jfkwz" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.457981 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fqc6m\" (UniqueName: \"kubernetes.io/projected/8793993f-a364-4eb7-bb10-d30f58f0e2c0-kube-api-access-fqc6m\") pod \"image-registry-697d97f7c8-h4hkd\" (UID: \"8793993f-a364-4eb7-bb10-d30f58f0e2c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-h4hkd" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.457999 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/127e79f7-9345-4611-b7eb-daf32a27f413-etcd-service-ca\") pod \"etcd-operator-b45778765-zg4z4\" (UID: \"127e79f7-9345-4611-b7eb-daf32a27f413\") " pod="openshift-etcd-operator/etcd-operator-b45778765-zg4z4" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.458016 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/7a1bebd5-fb32-4e26-8a34-2efba789ef00-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-4gq5l\" (UID: \"7a1bebd5-fb32-4e26-8a34-2efba789ef00\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4gq5l" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.458033 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/ff019d52-fc68-49e4-a096-b0f8d5b3cedf-signing-key\") pod \"service-ca-9c57cc56f-m9qgv\" (UID: \"ff019d52-fc68-49e4-a096-b0f8d5b3cedf\") " pod="openshift-service-ca/service-ca-9c57cc56f-m9qgv" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.458054 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mhwfw\" (UniqueName: \"kubernetes.io/projected/ed946177-7292-4d1e-8d89-a7e383701224-kube-api-access-mhwfw\") pod \"migrator-59844c95c7-wml5t\" (UID: \"ed946177-7292-4d1e-8d89-a7e383701224\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wml5t" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.458072 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c085ae2a-3f96-435a-9104-64d8db170233-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-g2phc\" (UID: \"c085ae2a-3f96-435a-9104-64d8db170233\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-g2phc" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.458087 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/27f98a08-f7d9-4057-a44c-30c88bfea40a-tmpfs\") pod \"packageserver-d55dfcdfc-hwfjz\" (UID: \"27f98a08-f7d9-4057-a44c-30c88bfea40a\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hwfjz" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.458104 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/127e79f7-9345-4611-b7eb-daf32a27f413-serving-cert\") pod \"etcd-operator-b45778765-zg4z4\" (UID: \"127e79f7-9345-4611-b7eb-daf32a27f413\") " pod="openshift-etcd-operator/etcd-operator-b45778765-zg4z4" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.458121 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1c106f56-cda7-41ea-9c9b-9b797ccf621b-metrics-tls\") pod \"ingress-operator-5b745b69d9-9g4wk\" (UID: \"1c106f56-cda7-41ea-9c9b-9b797ccf621b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9g4wk" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.458140 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7b969c27-e4b6-4cde-8263-b2d879764809-audit-dir\") pod \"apiserver-7bbb656c7d-jfkwz\" (UID: \"7b969c27-e4b6-4cde-8263-b2d879764809\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jfkwz" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.458157 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49b771c8-c171-49ac-a805-c13f3fc308e5-config\") pod \"kube-controller-manager-operator-78b949d7b-tv5hb\" (UID: 
\"49b771c8-c171-49ac-a805-c13f3fc308e5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-tv5hb" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.458172 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4fw4q\" (UniqueName: \"kubernetes.io/projected/ff019d52-fc68-49e4-a096-b0f8d5b3cedf-kube-api-access-4fw4q\") pod \"service-ca-9c57cc56f-m9qgv\" (UID: \"ff019d52-fc68-49e4-a096-b0f8d5b3cedf\") " pod="openshift-service-ca/service-ca-9c57cc56f-m9qgv" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.458188 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cf2d8832-8f75-4a49-9f0e-18ea211acb4c-serving-cert\") pod \"service-ca-operator-777779d784-nmggh\" (UID: \"cf2d8832-8f75-4a49-9f0e-18ea211acb4c\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-nmggh" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.458203 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e78dfd8f-d109-412c-afcc-2e9fe3786bd3-secret-volume\") pod \"collect-profiles-29324160-nmbdb\" (UID: \"e78dfd8f-d109-412c-afcc-2e9fe3786bd3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324160-nmbdb" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.458219 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/8e92b02b-2ba0-4d57-b018-5592e79a239e-mountpoint-dir\") pod \"csi-hostpathplugin-g59tb\" (UID: \"8e92b02b-2ba0-4d57-b018-5592e79a239e\") " pod="hostpath-provisioner/csi-hostpathplugin-g59tb" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.458238 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c8m9z\" (UniqueName: \"kubernetes.io/projected/8445ce5e-0123-4888-a603-9802c94e9c09-kube-api-access-c8m9z\") pod \"console-operator-58897d9998-r4wx4\" (UID: \"8445ce5e-0123-4888-a603-9802c94e9c09\") " pod="openshift-console-operator/console-operator-58897d9998-r4wx4" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.458258 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tlbzn\" (UniqueName: \"kubernetes.io/projected/9157b726-b8a9-497e-b833-8612e28868d1-kube-api-access-tlbzn\") pod \"machine-config-controller-84d6567774-6v588\" (UID: \"9157b726-b8a9-497e-b833-8612e28868d1\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6v588" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.458292 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/27f98a08-f7d9-4057-a44c-30c88bfea40a-webhook-cert\") pod \"packageserver-d55dfcdfc-hwfjz\" (UID: \"27f98a08-f7d9-4057-a44c-30c88bfea40a\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hwfjz" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.458314 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fnjzk\" (UniqueName: \"kubernetes.io/projected/c085ae2a-3f96-435a-9104-64d8db170233-kube-api-access-fnjzk\") pod \"openshift-controller-manager-operator-756b6f6bc6-g2phc\" (UID: \"c085ae2a-3f96-435a-9104-64d8db170233\") " 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-g2phc" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.458330 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9157b726-b8a9-497e-b833-8612e28868d1-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-6v588\" (UID: \"9157b726-b8a9-497e-b833-8612e28868d1\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6v588" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.458360 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/476e6771-16a7-4aea-bbed-7fce15ec230d-config-volume\") pod \"dns-default-svzx5\" (UID: \"476e6771-16a7-4aea-bbed-7fce15ec230d\") " pod="openshift-dns/dns-default-svzx5" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.458374 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/7b969c27-e4b6-4cde-8263-b2d879764809-etcd-client\") pod \"apiserver-7bbb656c7d-jfkwz\" (UID: \"7b969c27-e4b6-4cde-8263-b2d879764809\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jfkwz" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.458402 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/49b771c8-c171-49ac-a805-c13f3fc308e5-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-tv5hb\" (UID: \"49b771c8-c171-49ac-a805-c13f3fc308e5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-tv5hb" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.458419 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/28399927-5270-4b93-ae66-5c08a318b376-service-ca-bundle\") pod \"router-default-5444994796-w8cn7\" (UID: \"28399927-5270-4b93-ae66-5c08a318b376\") " pod="openshift-ingress/router-default-5444994796-w8cn7" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.458441 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h4hkd\" (UID: \"8793993f-a364-4eb7-bb10-d30f58f0e2c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-h4hkd" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.458457 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8793993f-a364-4eb7-bb10-d30f58f0e2c0-trusted-ca\") pod \"image-registry-697d97f7c8-h4hkd\" (UID: \"8793993f-a364-4eb7-bb10-d30f58f0e2c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-h4hkd" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.458475 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/7e194a07-871a-4465-a851-64890b30b704-node-bootstrap-token\") pod \"machine-config-server-g4wns\" (UID: \"7e194a07-871a-4465-a851-64890b30b704\") " pod="openshift-machine-config-operator/machine-config-server-g4wns" Oct 03 00:09:08 crc kubenswrapper[4832]: 
I1003 00:09:08.458514 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/49b771c8-c171-49ac-a805-c13f3fc308e5-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-tv5hb\" (UID: \"49b771c8-c171-49ac-a805-c13f3fc308e5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-tv5hb" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.458539 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/8315e005-d902-4564-a12a-83dfe30643f2-images\") pod \"machine-config-operator-74547568cd-8pn2f\" (UID: \"8315e005-d902-4564-a12a-83dfe30643f2\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8pn2f" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.458558 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j7q2c\" (UniqueName: \"kubernetes.io/projected/8315e005-d902-4564-a12a-83dfe30643f2-kube-api-access-j7q2c\") pod \"machine-config-operator-74547568cd-8pn2f\" (UID: \"8315e005-d902-4564-a12a-83dfe30643f2\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8pn2f" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.458588 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8793993f-a364-4eb7-bb10-d30f58f0e2c0-bound-sa-token\") pod \"image-registry-697d97f7c8-h4hkd\" (UID: \"8793993f-a364-4eb7-bb10-d30f58f0e2c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-h4hkd" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.458606 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-42qsw\" (UniqueName: \"kubernetes.io/projected/19441502-65d7-49aa-adc0-1689bbecd75e-kube-api-access-42qsw\") pod \"catalog-operator-68c6474976-7xrp7\" (UID: \"19441502-65d7-49aa-adc0-1689bbecd75e\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7xrp7" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.458623 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kk6tj\" (UniqueName: \"kubernetes.io/projected/6dba70f2-c60f-4d7a-a288-8597b4f95145-kube-api-access-kk6tj\") pod \"multus-admission-controller-857f4d67dd-xtqlf\" (UID: \"6dba70f2-c60f-4d7a-a288-8597b4f95145\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-xtqlf" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.458642 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-42r4w\" (UniqueName: \"kubernetes.io/projected/9e7e54a5-be5d-4ae2-a4e2-a5f1a7349af8-kube-api-access-42r4w\") pod \"package-server-manager-789f6589d5-l5pfg\" (UID: \"9e7e54a5-be5d-4ae2-a4e2-a5f1a7349af8\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-l5pfg" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.458658 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8445ce5e-0123-4888-a603-9802c94e9c09-trusted-ca\") pod \"console-operator-58897d9998-r4wx4\" (UID: \"8445ce5e-0123-4888-a603-9802c94e9c09\") " pod="openshift-console-operator/console-operator-58897d9998-r4wx4" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.458677 4832 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/28399927-5270-4b93-ae66-5c08a318b376-metrics-certs\") pod \"router-default-5444994796-w8cn7\" (UID: \"28399927-5270-4b93-ae66-5c08a318b376\") " pod="openshift-ingress/router-default-5444994796-w8cn7" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.458693 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c085ae2a-3f96-435a-9104-64d8db170233-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-g2phc\" (UID: \"c085ae2a-3f96-435a-9104-64d8db170233\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-g2phc" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.458711 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a1bebd5-fb32-4e26-8a34-2efba789ef00-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-4gq5l\" (UID: \"7a1bebd5-fb32-4e26-8a34-2efba789ef00\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4gq5l" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.458730 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1c106f56-cda7-41ea-9c9b-9b797ccf621b-trusted-ca\") pod \"ingress-operator-5b745b69d9-9g4wk\" (UID: \"1c106f56-cda7-41ea-9c9b-9b797ccf621b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9g4wk" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.458745 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4a1f4f2f-c8de-4020-ac5c-470ca0fd95eb-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-h5fz8\" (UID: \"4a1f4f2f-c8de-4020-ac5c-470ca0fd95eb\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-h5fz8" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.458787 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7j7nj\" (UniqueName: \"kubernetes.io/projected/8ac12ff5-1646-45c2-822c-7741b913d805-kube-api-access-7j7nj\") pod \"marketplace-operator-79b997595-wbjmp\" (UID: \"8ac12ff5-1646-45c2-822c-7741b913d805\") " pod="openshift-marketplace/marketplace-operator-79b997595-wbjmp" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.458808 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kc868\" (UniqueName: \"kubernetes.io/projected/7a1bebd5-fb32-4e26-8a34-2efba789ef00-kube-api-access-kc868\") pod \"kube-storage-version-migrator-operator-b67b599dd-4gq5l\" (UID: \"7a1bebd5-fb32-4e26-8a34-2efba789ef00\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4gq5l" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.458826 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2573fb7d-f41d-4f5d-a2da-b9c276c817f0-serving-cert\") pod \"openshift-config-operator-7777fb866f-zzj42\" (UID: \"2573fb7d-f41d-4f5d-a2da-b9c276c817f0\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-zzj42" Oct 03 00:09:08 crc 
kubenswrapper[4832]: I1003 00:09:08.458845 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8793993f-a364-4eb7-bb10-d30f58f0e2c0-ca-trust-extracted\") pod \"image-registry-697d97f7c8-h4hkd\" (UID: \"8793993f-a364-4eb7-bb10-d30f58f0e2c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-h4hkd" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.458862 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8793993f-a364-4eb7-bb10-d30f58f0e2c0-registry-certificates\") pod \"image-registry-697d97f7c8-h4hkd\" (UID: \"8793993f-a364-4eb7-bb10-d30f58f0e2c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-h4hkd" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.458891 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e78dfd8f-d109-412c-afcc-2e9fe3786bd3-config-volume\") pod \"collect-profiles-29324160-nmbdb\" (UID: \"e78dfd8f-d109-412c-afcc-2e9fe3786bd3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324160-nmbdb" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.458908 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/7b969c27-e4b6-4cde-8263-b2d879764809-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-jfkwz\" (UID: \"7b969c27-e4b6-4cde-8263-b2d879764809\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jfkwz" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.458925 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/ff019d52-fc68-49e4-a096-b0f8d5b3cedf-signing-cabundle\") pod \"service-ca-9c57cc56f-m9qgv\" (UID: \"ff019d52-fc68-49e4-a096-b0f8d5b3cedf\") " pod="openshift-service-ca/service-ca-9c57cc56f-m9qgv" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.458942 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/7e194a07-871a-4465-a851-64890b30b704-certs\") pod \"machine-config-server-g4wns\" (UID: \"7e194a07-871a-4465-a851-64890b30b704\") " pod="openshift-machine-config-operator/machine-config-server-g4wns" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.458979 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8793993f-a364-4eb7-bb10-d30f58f0e2c0-installation-pull-secrets\") pod \"image-registry-697d97f7c8-h4hkd\" (UID: \"8793993f-a364-4eb7-bb10-d30f58f0e2c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-h4hkd" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.458996 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/078d885e-a1eb-4bf6-8f0f-b5058ee3c2dc-cert\") pod \"ingress-canary-qcrcz\" (UID: \"078d885e-a1eb-4bf6-8f0f-b5058ee3c2dc\") " pod="openshift-ingress-canary/ingress-canary-qcrcz" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.459011 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: 
\"kubernetes.io/secret/8206a415-c51d-4ed4-bf97-1c00f05810f4-srv-cert\") pod \"olm-operator-6b444d44fb-lw2b2\" (UID: \"8206a415-c51d-4ed4-bf97-1c00f05810f4\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lw2b2" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.459032 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/476e6771-16a7-4aea-bbed-7fce15ec230d-metrics-tls\") pod \"dns-default-svzx5\" (UID: \"476e6771-16a7-4aea-bbed-7fce15ec230d\") " pod="openshift-dns/dns-default-svzx5" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.459050 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fws9m\" (UniqueName: \"kubernetes.io/projected/27f98a08-f7d9-4057-a44c-30c88bfea40a-kube-api-access-fws9m\") pod \"packageserver-d55dfcdfc-hwfjz\" (UID: \"27f98a08-f7d9-4057-a44c-30c88bfea40a\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hwfjz" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.459076 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2vdx\" (UniqueName: \"kubernetes.io/projected/7b969c27-e4b6-4cde-8263-b2d879764809-kube-api-access-b2vdx\") pod \"apiserver-7bbb656c7d-jfkwz\" (UID: \"7b969c27-e4b6-4cde-8263-b2d879764809\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jfkwz" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.459093 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7b969c27-e4b6-4cde-8263-b2d879764809-serving-cert\") pod \"apiserver-7bbb656c7d-jfkwz\" (UID: \"7b969c27-e4b6-4cde-8263-b2d879764809\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jfkwz" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.459111 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x5mxf\" (UniqueName: \"kubernetes.io/projected/8206a415-c51d-4ed4-bf97-1c00f05810f4-kube-api-access-x5mxf\") pod \"olm-operator-6b444d44fb-lw2b2\" (UID: \"8206a415-c51d-4ed4-bf97-1c00f05810f4\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lw2b2" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.459128 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed757b20-1bd6-4a44-a75a-f1698623d397-config\") pod \"kube-apiserver-operator-766d6c64bb-djlqs\" (UID: \"ed757b20-1bd6-4a44-a75a-f1698623d397\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-djlqs" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.459144 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/19441502-65d7-49aa-adc0-1689bbecd75e-profile-collector-cert\") pod \"catalog-operator-68c6474976-7xrp7\" (UID: \"19441502-65d7-49aa-adc0-1689bbecd75e\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7xrp7" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.459162 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/19441502-65d7-49aa-adc0-1689bbecd75e-srv-cert\") pod \"catalog-operator-68c6474976-7xrp7\" (UID: \"19441502-65d7-49aa-adc0-1689bbecd75e\") 
" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7xrp7" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.459183 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5dh6\" (UniqueName: \"kubernetes.io/projected/078d885e-a1eb-4bf6-8f0f-b5058ee3c2dc-kube-api-access-s5dh6\") pod \"ingress-canary-qcrcz\" (UID: \"078d885e-a1eb-4bf6-8f0f-b5058ee3c2dc\") " pod="openshift-ingress-canary/ingress-canary-qcrcz" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.459209 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ed757b20-1bd6-4a44-a75a-f1698623d397-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-djlqs\" (UID: \"ed757b20-1bd6-4a44-a75a-f1698623d397\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-djlqs" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.459225 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mh4gc\" (UniqueName: \"kubernetes.io/projected/7e194a07-871a-4465-a851-64890b30b704-kube-api-access-mh4gc\") pod \"machine-config-server-g4wns\" (UID: \"7e194a07-871a-4465-a851-64890b30b704\") " pod="openshift-machine-config-operator/machine-config-server-g4wns" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.459247 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/8e92b02b-2ba0-4d57-b018-5592e79a239e-socket-dir\") pod \"csi-hostpathplugin-g59tb\" (UID: \"8e92b02b-2ba0-4d57-b018-5592e79a239e\") " pod="hostpath-provisioner/csi-hostpathplugin-g59tb" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.470474 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9157b726-b8a9-497e-b833-8612e28868d1-proxy-tls\") pod \"machine-config-controller-84d6567774-6v588\" (UID: \"9157b726-b8a9-497e-b833-8612e28868d1\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6v588" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.471405 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8315e005-d902-4564-a12a-83dfe30643f2-auth-proxy-config\") pod \"machine-config-operator-74547568cd-8pn2f\" (UID: \"8315e005-d902-4564-a12a-83dfe30643f2\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8pn2f" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.472681 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/28399927-5270-4b93-ae66-5c08a318b376-default-certificate\") pod \"router-default-5444994796-w8cn7\" (UID: \"28399927-5270-4b93-ae66-5c08a318b376\") " pod="openshift-ingress/router-default-5444994796-w8cn7" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.475073 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/127e79f7-9345-4611-b7eb-daf32a27f413-config\") pod \"etcd-operator-b45778765-zg4z4\" (UID: \"127e79f7-9345-4611-b7eb-daf32a27f413\") " pod="openshift-etcd-operator/etcd-operator-b45778765-zg4z4" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.480513 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8793993f-a364-4eb7-bb10-d30f58f0e2c0-trusted-ca\") pod \"image-registry-697d97f7c8-h4hkd\" (UID: \"8793993f-a364-4eb7-bb10-d30f58f0e2c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-h4hkd" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.481477 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8445ce5e-0123-4888-a603-9802c94e9c09-serving-cert\") pod \"console-operator-58897d9998-r4wx4\" (UID: \"8445ce5e-0123-4888-a603-9802c94e9c09\") " pod="openshift-console-operator/console-operator-58897d9998-r4wx4" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.482155 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c085ae2a-3f96-435a-9104-64d8db170233-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-g2phc\" (UID: \"c085ae2a-3f96-435a-9104-64d8db170233\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-g2phc" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.482316 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a1bebd5-fb32-4e26-8a34-2efba789ef00-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-4gq5l\" (UID: \"7a1bebd5-fb32-4e26-8a34-2efba789ef00\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4gq5l" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.482610 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8793993f-a364-4eb7-bb10-d30f58f0e2c0-ca-trust-extracted\") pod \"image-registry-697d97f7c8-h4hkd\" (UID: \"8793993f-a364-4eb7-bb10-d30f58f0e2c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-h4hkd" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.482716 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/127e79f7-9345-4611-b7eb-daf32a27f413-etcd-service-ca\") pod \"etcd-operator-b45778765-zg4z4\" (UID: \"127e79f7-9345-4611-b7eb-daf32a27f413\") " pod="openshift-etcd-operator/etcd-operator-b45778765-zg4z4" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.483017 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1c106f56-cda7-41ea-9c9b-9b797ccf621b-trusted-ca\") pod \"ingress-operator-5b745b69d9-9g4wk\" (UID: \"1c106f56-cda7-41ea-9c9b-9b797ccf621b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9g4wk" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.483622 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8793993f-a364-4eb7-bb10-d30f58f0e2c0-registry-certificates\") pod \"image-registry-697d97f7c8-h4hkd\" (UID: \"8793993f-a364-4eb7-bb10-d30f58f0e2c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-h4hkd" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.484079 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/127e79f7-9345-4611-b7eb-daf32a27f413-etcd-client\") pod \"etcd-operator-b45778765-zg4z4\" (UID: \"127e79f7-9345-4611-b7eb-daf32a27f413\") " 
pod="openshift-etcd-operator/etcd-operator-b45778765-zg4z4" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.484094 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed757b20-1bd6-4a44-a75a-f1698623d397-config\") pod \"kube-apiserver-operator-766d6c64bb-djlqs\" (UID: \"ed757b20-1bd6-4a44-a75a-f1698623d397\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-djlqs" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.484582 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8445ce5e-0123-4888-a603-9802c94e9c09-trusted-ca\") pod \"console-operator-58897d9998-r4wx4\" (UID: \"8445ce5e-0123-4888-a603-9802c94e9c09\") " pod="openshift-console-operator/console-operator-58897d9998-r4wx4" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.484639 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-vgm2t" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.485157 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49b771c8-c171-49ac-a805-c13f3fc308e5-config\") pod \"kube-controller-manager-operator-78b949d7b-tv5hb\" (UID: \"49b771c8-c171-49ac-a805-c13f3fc308e5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-tv5hb" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.488391 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9157b726-b8a9-497e-b833-8612e28868d1-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-6v588\" (UID: \"9157b726-b8a9-497e-b833-8612e28868d1\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6v588" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.489037 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/28399927-5270-4b93-ae66-5c08a318b376-service-ca-bundle\") pod \"router-default-5444994796-w8cn7\" (UID: \"28399927-5270-4b93-ae66-5c08a318b376\") " pod="openshift-ingress/router-default-5444994796-w8cn7" Oct 03 00:09:08 crc kubenswrapper[4832]: E1003 00:09:08.489295 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 00:09:08.989278897 +0000 UTC m=+142.251434003 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h4hkd" (UID: "8793993f-a364-4eb7-bb10-d30f58f0e2c0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.489769 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/127e79f7-9345-4611-b7eb-daf32a27f413-etcd-ca\") pod \"etcd-operator-b45778765-zg4z4\" (UID: \"127e79f7-9345-4611-b7eb-daf32a27f413\") " pod="openshift-etcd-operator/etcd-operator-b45778765-zg4z4" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.490342 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8445ce5e-0123-4888-a603-9802c94e9c09-config\") pod \"console-operator-58897d9998-r4wx4\" (UID: \"8445ce5e-0123-4888-a603-9802c94e9c09\") " pod="openshift-console-operator/console-operator-58897d9998-r4wx4" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.492255 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/8315e005-d902-4564-a12a-83dfe30643f2-images\") pod \"machine-config-operator-74547568cd-8pn2f\" (UID: \"8315e005-d902-4564-a12a-83dfe30643f2\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8pn2f" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.514735 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8793993f-a364-4eb7-bb10-d30f58f0e2c0-installation-pull-secrets\") pod \"image-registry-697d97f7c8-h4hkd\" (UID: \"8793993f-a364-4eb7-bb10-d30f58f0e2c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-h4hkd" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.515056 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/19441502-65d7-49aa-adc0-1689bbecd75e-profile-collector-cert\") pod \"catalog-operator-68c6474976-7xrp7\" (UID: \"19441502-65d7-49aa-adc0-1689bbecd75e\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7xrp7" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.517039 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ed757b20-1bd6-4a44-a75a-f1698623d397-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-djlqs\" (UID: \"ed757b20-1bd6-4a44-a75a-f1698623d397\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-djlqs" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.520105 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/28399927-5270-4b93-ae66-5c08a318b376-metrics-certs\") pod \"router-default-5444994796-w8cn7\" (UID: \"28399927-5270-4b93-ae66-5c08a318b376\") " pod="openshift-ingress/router-default-5444994796-w8cn7" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.520267 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8793993f-a364-4eb7-bb10-d30f58f0e2c0-registry-tls\") 
pod \"image-registry-697d97f7c8-h4hkd\" (UID: \"8793993f-a364-4eb7-bb10-d30f58f0e2c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-h4hkd" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.520515 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8315e005-d902-4564-a12a-83dfe30643f2-proxy-tls\") pod \"machine-config-operator-74547568cd-8pn2f\" (UID: \"8315e005-d902-4564-a12a-83dfe30643f2\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8pn2f" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.520848 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-9kl2v" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.521267 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/49b771c8-c171-49ac-a805-c13f3fc308e5-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-tv5hb\" (UID: \"49b771c8-c171-49ac-a805-c13f3fc308e5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-tv5hb" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.522163 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/127e79f7-9345-4611-b7eb-daf32a27f413-serving-cert\") pod \"etcd-operator-b45778765-zg4z4\" (UID: \"127e79f7-9345-4611-b7eb-daf32a27f413\") " pod="openshift-etcd-operator/etcd-operator-b45778765-zg4z4" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.526199 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/19441502-65d7-49aa-adc0-1689bbecd75e-srv-cert\") pod \"catalog-operator-68c6474976-7xrp7\" (UID: \"19441502-65d7-49aa-adc0-1689bbecd75e\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7xrp7" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.526870 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/9e7e54a5-be5d-4ae2-a4e2-a5f1a7349af8-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-l5pfg\" (UID: \"9e7e54a5-be5d-4ae2-a4e2-a5f1a7349af8\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-l5pfg" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.527006 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-flzsh\" (UniqueName: \"kubernetes.io/projected/1c106f56-cda7-41ea-9c9b-9b797ccf621b-kube-api-access-flzsh\") pod \"ingress-operator-5b745b69d9-9g4wk\" (UID: \"1c106f56-cda7-41ea-9c9b-9b797ccf621b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9g4wk" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.527429 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1c106f56-cda7-41ea-9c9b-9b797ccf621b-metrics-tls\") pod \"ingress-operator-5b745b69d9-9g4wk\" (UID: \"1c106f56-cda7-41ea-9c9b-9b797ccf621b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9g4wk" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.528164 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: 
\"kubernetes.io/secret/28399927-5270-4b93-ae66-5c08a318b376-stats-auth\") pod \"router-default-5444994796-w8cn7\" (UID: \"28399927-5270-4b93-ae66-5c08a318b376\") " pod="openshift-ingress/router-default-5444994796-w8cn7" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.530274 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c085ae2a-3f96-435a-9104-64d8db170233-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-g2phc\" (UID: \"c085ae2a-3f96-435a-9104-64d8db170233\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-g2phc" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.533059 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1c106f56-cda7-41ea-9c9b-9b797ccf621b-bound-sa-token\") pod \"ingress-operator-5b745b69d9-9g4wk\" (UID: \"1c106f56-cda7-41ea-9c9b-9b797ccf621b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9g4wk" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.551456 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-55rx2\" (UniqueName: \"kubernetes.io/projected/28399927-5270-4b93-ae66-5c08a318b376-kube-api-access-55rx2\") pod \"router-default-5444994796-w8cn7\" (UID: \"28399927-5270-4b93-ae66-5c08a318b376\") " pod="openshift-ingress/router-default-5444994796-w8cn7" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.553123 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7a1bebd5-fb32-4e26-8a34-2efba789ef00-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-4gq5l\" (UID: \"7a1bebd5-fb32-4e26-8a34-2efba789ef00\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4gq5l" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.554098 4832 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-w8cn7" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.561820 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.562001 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/476e6771-16a7-4aea-bbed-7fce15ec230d-config-volume\") pod \"dns-default-svzx5\" (UID: \"476e6771-16a7-4aea-bbed-7fce15ec230d\") " pod="openshift-dns/dns-default-svzx5" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.562027 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/7b969c27-e4b6-4cde-8263-b2d879764809-etcd-client\") pod \"apiserver-7bbb656c7d-jfkwz\" (UID: \"7b969c27-e4b6-4cde-8263-b2d879764809\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jfkwz" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.562069 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/7e194a07-871a-4465-a851-64890b30b704-node-bootstrap-token\") pod \"machine-config-server-g4wns\" (UID: \"7e194a07-871a-4465-a851-64890b30b704\") " pod="openshift-machine-config-operator/machine-config-server-g4wns" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.562128 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kk6tj\" (UniqueName: \"kubernetes.io/projected/6dba70f2-c60f-4d7a-a288-8597b4f95145-kube-api-access-kk6tj\") pod \"multus-admission-controller-857f4d67dd-xtqlf\" (UID: \"6dba70f2-c60f-4d7a-a288-8597b4f95145\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-xtqlf" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.562157 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4a1f4f2f-c8de-4020-ac5c-470ca0fd95eb-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-h5fz8\" (UID: \"4a1f4f2f-c8de-4020-ac5c-470ca0fd95eb\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-h5fz8" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.562177 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7j7nj\" (UniqueName: \"kubernetes.io/projected/8ac12ff5-1646-45c2-822c-7741b913d805-kube-api-access-7j7nj\") pod \"marketplace-operator-79b997595-wbjmp\" (UID: \"8ac12ff5-1646-45c2-822c-7741b913d805\") " pod="openshift-marketplace/marketplace-operator-79b997595-wbjmp" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.562203 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2573fb7d-f41d-4f5d-a2da-b9c276c817f0-serving-cert\") pod \"openshift-config-operator-7777fb866f-zzj42\" (UID: \"2573fb7d-f41d-4f5d-a2da-b9c276c817f0\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-zzj42" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.562219 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/7b969c27-e4b6-4cde-8263-b2d879764809-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-jfkwz\" (UID: \"7b969c27-e4b6-4cde-8263-b2d879764809\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jfkwz" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.562238 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e78dfd8f-d109-412c-afcc-2e9fe3786bd3-config-volume\") pod \"collect-profiles-29324160-nmbdb\" (UID: \"e78dfd8f-d109-412c-afcc-2e9fe3786bd3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324160-nmbdb" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.562260 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/7e194a07-871a-4465-a851-64890b30b704-certs\") pod \"machine-config-server-g4wns\" (UID: \"7e194a07-871a-4465-a851-64890b30b704\") " pod="openshift-machine-config-operator/machine-config-server-g4wns" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.562278 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/ff019d52-fc68-49e4-a096-b0f8d5b3cedf-signing-cabundle\") pod \"service-ca-9c57cc56f-m9qgv\" (UID: \"ff019d52-fc68-49e4-a096-b0f8d5b3cedf\") " pod="openshift-service-ca/service-ca-9c57cc56f-m9qgv" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.562302 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/8206a415-c51d-4ed4-bf97-1c00f05810f4-srv-cert\") pod \"olm-operator-6b444d44fb-lw2b2\" (UID: \"8206a415-c51d-4ed4-bf97-1c00f05810f4\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lw2b2" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.562321 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/078d885e-a1eb-4bf6-8f0f-b5058ee3c2dc-cert\") pod \"ingress-canary-qcrcz\" (UID: \"078d885e-a1eb-4bf6-8f0f-b5058ee3c2dc\") " pod="openshift-ingress-canary/ingress-canary-qcrcz" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.562338 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fws9m\" (UniqueName: \"kubernetes.io/projected/27f98a08-f7d9-4057-a44c-30c88bfea40a-kube-api-access-fws9m\") pod \"packageserver-d55dfcdfc-hwfjz\" (UID: \"27f98a08-f7d9-4057-a44c-30c88bfea40a\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hwfjz" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.562356 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/476e6771-16a7-4aea-bbed-7fce15ec230d-metrics-tls\") pod \"dns-default-svzx5\" (UID: \"476e6771-16a7-4aea-bbed-7fce15ec230d\") " pod="openshift-dns/dns-default-svzx5" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.562377 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b2vdx\" (UniqueName: \"kubernetes.io/projected/7b969c27-e4b6-4cde-8263-b2d879764809-kube-api-access-b2vdx\") pod \"apiserver-7bbb656c7d-jfkwz\" (UID: \"7b969c27-e4b6-4cde-8263-b2d879764809\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jfkwz" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.562430 4832 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-x5mxf\" (UniqueName: \"kubernetes.io/projected/8206a415-c51d-4ed4-bf97-1c00f05810f4-kube-api-access-x5mxf\") pod \"olm-operator-6b444d44fb-lw2b2\" (UID: \"8206a415-c51d-4ed4-bf97-1c00f05810f4\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lw2b2" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.562447 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7b969c27-e4b6-4cde-8263-b2d879764809-serving-cert\") pod \"apiserver-7bbb656c7d-jfkwz\" (UID: \"7b969c27-e4b6-4cde-8263-b2d879764809\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jfkwz" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.562478 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5dh6\" (UniqueName: \"kubernetes.io/projected/078d885e-a1eb-4bf6-8f0f-b5058ee3c2dc-kube-api-access-s5dh6\") pod \"ingress-canary-qcrcz\" (UID: \"078d885e-a1eb-4bf6-8f0f-b5058ee3c2dc\") " pod="openshift-ingress-canary/ingress-canary-qcrcz" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.562497 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/8e92b02b-2ba0-4d57-b018-5592e79a239e-socket-dir\") pod \"csi-hostpathplugin-g59tb\" (UID: \"8e92b02b-2ba0-4d57-b018-5592e79a239e\") " pod="hostpath-provisioner/csi-hostpathplugin-g59tb" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.562517 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mh4gc\" (UniqueName: \"kubernetes.io/projected/7e194a07-871a-4465-a851-64890b30b704-kube-api-access-mh4gc\") pod \"machine-config-server-g4wns\" (UID: \"7e194a07-871a-4465-a851-64890b30b704\") " pod="openshift-machine-config-operator/machine-config-server-g4wns" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.562539 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-prk2p\" (UniqueName: \"kubernetes.io/projected/476e6771-16a7-4aea-bbed-7fce15ec230d-kube-api-access-prk2p\") pod \"dns-default-svzx5\" (UID: \"476e6771-16a7-4aea-bbed-7fce15ec230d\") " pod="openshift-dns/dns-default-svzx5" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.562559 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/27f98a08-f7d9-4057-a44c-30c88bfea40a-apiservice-cert\") pod \"packageserver-d55dfcdfc-hwfjz\" (UID: \"27f98a08-f7d9-4057-a44c-30c88bfea40a\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hwfjz" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.562576 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/8e92b02b-2ba0-4d57-b018-5592e79a239e-registration-dir\") pod \"csi-hostpathplugin-g59tb\" (UID: \"8e92b02b-2ba0-4d57-b018-5592e79a239e\") " pod="hostpath-provisioner/csi-hostpathplugin-g59tb" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.562595 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/8e92b02b-2ba0-4d57-b018-5592e79a239e-csi-data-dir\") pod \"csi-hostpathplugin-g59tb\" (UID: \"8e92b02b-2ba0-4d57-b018-5592e79a239e\") " pod="hostpath-provisioner/csi-hostpathplugin-g59tb" Oct 03 00:09:08 crc 
kubenswrapper[4832]: I1003 00:09:08.562614 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/6dba70f2-c60f-4d7a-a288-8597b4f95145-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-xtqlf\" (UID: \"6dba70f2-c60f-4d7a-a288-8597b4f95145\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-xtqlf" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.562634 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/8206a415-c51d-4ed4-bf97-1c00f05810f4-profile-collector-cert\") pod \"olm-operator-6b444d44fb-lw2b2\" (UID: \"8206a415-c51d-4ed4-bf97-1c00f05810f4\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lw2b2" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.562653 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/8e92b02b-2ba0-4d57-b018-5592e79a239e-plugins-dir\") pod \"csi-hostpathplugin-g59tb\" (UID: \"8e92b02b-2ba0-4d57-b018-5592e79a239e\") " pod="hostpath-provisioner/csi-hostpathplugin-g59tb" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.562672 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4a1f4f2f-c8de-4020-ac5c-470ca0fd95eb-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-h5fz8\" (UID: \"4a1f4f2f-c8de-4020-ac5c-470ca0fd95eb\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-h5fz8" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.562699 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/8ac12ff5-1646-45c2-822c-7741b913d805-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-wbjmp\" (UID: \"8ac12ff5-1646-45c2-822c-7741b913d805\") " pod="openshift-marketplace/marketplace-operator-79b997595-wbjmp" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.562719 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/2573fb7d-f41d-4f5d-a2da-b9c276c817f0-available-featuregates\") pod \"openshift-config-operator-7777fb866f-zzj42\" (UID: \"2573fb7d-f41d-4f5d-a2da-b9c276c817f0\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-zzj42" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.562737 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8ac12ff5-1646-45c2-822c-7741b913d805-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-wbjmp\" (UID: \"8ac12ff5-1646-45c2-822c-7741b913d805\") " pod="openshift-marketplace/marketplace-operator-79b997595-wbjmp" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.562780 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf2d8832-8f75-4a49-9f0e-18ea211acb4c-config\") pod \"service-ca-operator-777779d784-nmggh\" (UID: \"cf2d8832-8f75-4a49-9f0e-18ea211acb4c\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-nmggh" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.562799 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-4rhsp\" (UniqueName: \"kubernetes.io/projected/2573fb7d-f41d-4f5d-a2da-b9c276c817f0-kube-api-access-4rhsp\") pod \"openshift-config-operator-7777fb866f-zzj42\" (UID: \"2573fb7d-f41d-4f5d-a2da-b9c276c817f0\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-zzj42" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.562823 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nzfxt\" (UniqueName: \"kubernetes.io/projected/cf2d8832-8f75-4a49-9f0e-18ea211acb4c-kube-api-access-nzfxt\") pod \"service-ca-operator-777779d784-nmggh\" (UID: \"cf2d8832-8f75-4a49-9f0e-18ea211acb4c\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-nmggh" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.562845 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/7b969c27-e4b6-4cde-8263-b2d879764809-encryption-config\") pod \"apiserver-7bbb656c7d-jfkwz\" (UID: \"7b969c27-e4b6-4cde-8263-b2d879764809\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jfkwz" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.562863 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7b969c27-e4b6-4cde-8263-b2d879764809-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-jfkwz\" (UID: \"7b969c27-e4b6-4cde-8263-b2d879764809\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jfkwz" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.562879 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a1f4f2f-c8de-4020-ac5c-470ca0fd95eb-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-h5fz8\" (UID: \"4a1f4f2f-c8de-4020-ac5c-470ca0fd95eb\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-h5fz8" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.562897 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jd7nk\" (UniqueName: \"kubernetes.io/projected/e78dfd8f-d109-412c-afcc-2e9fe3786bd3-kube-api-access-jd7nk\") pod \"collect-profiles-29324160-nmbdb\" (UID: \"e78dfd8f-d109-412c-afcc-2e9fe3786bd3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324160-nmbdb" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.562931 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2gxmq\" (UniqueName: \"kubernetes.io/projected/8e92b02b-2ba0-4d57-b018-5592e79a239e-kube-api-access-2gxmq\") pod \"csi-hostpathplugin-g59tb\" (UID: \"8e92b02b-2ba0-4d57-b018-5592e79a239e\") " pod="hostpath-provisioner/csi-hostpathplugin-g59tb" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.562948 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7b969c27-e4b6-4cde-8263-b2d879764809-audit-policies\") pod \"apiserver-7bbb656c7d-jfkwz\" (UID: \"7b969c27-e4b6-4cde-8263-b2d879764809\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jfkwz" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.562966 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/ff019d52-fc68-49e4-a096-b0f8d5b3cedf-signing-key\") pod \"service-ca-9c57cc56f-m9qgv\" (UID: 
\"ff019d52-fc68-49e4-a096-b0f8d5b3cedf\") " pod="openshift-service-ca/service-ca-9c57cc56f-m9qgv" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.563000 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/27f98a08-f7d9-4057-a44c-30c88bfea40a-tmpfs\") pod \"packageserver-d55dfcdfc-hwfjz\" (UID: \"27f98a08-f7d9-4057-a44c-30c88bfea40a\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hwfjz" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.563021 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7b969c27-e4b6-4cde-8263-b2d879764809-audit-dir\") pod \"apiserver-7bbb656c7d-jfkwz\" (UID: \"7b969c27-e4b6-4cde-8263-b2d879764809\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jfkwz" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.563038 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4fw4q\" (UniqueName: \"kubernetes.io/projected/ff019d52-fc68-49e4-a096-b0f8d5b3cedf-kube-api-access-4fw4q\") pod \"service-ca-9c57cc56f-m9qgv\" (UID: \"ff019d52-fc68-49e4-a096-b0f8d5b3cedf\") " pod="openshift-service-ca/service-ca-9c57cc56f-m9qgv" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.563056 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/8e92b02b-2ba0-4d57-b018-5592e79a239e-mountpoint-dir\") pod \"csi-hostpathplugin-g59tb\" (UID: \"8e92b02b-2ba0-4d57-b018-5592e79a239e\") " pod="hostpath-provisioner/csi-hostpathplugin-g59tb" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.563076 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cf2d8832-8f75-4a49-9f0e-18ea211acb4c-serving-cert\") pod \"service-ca-operator-777779d784-nmggh\" (UID: \"cf2d8832-8f75-4a49-9f0e-18ea211acb4c\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-nmggh" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.563092 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e78dfd8f-d109-412c-afcc-2e9fe3786bd3-secret-volume\") pod \"collect-profiles-29324160-nmbdb\" (UID: \"e78dfd8f-d109-412c-afcc-2e9fe3786bd3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324160-nmbdb" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.563125 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/27f98a08-f7d9-4057-a44c-30c88bfea40a-webhook-cert\") pod \"packageserver-d55dfcdfc-hwfjz\" (UID: \"27f98a08-f7d9-4057-a44c-30c88bfea40a\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hwfjz" Oct 03 00:09:08 crc kubenswrapper[4832]: E1003 00:09:08.581541 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 00:09:09.081515078 +0000 UTC m=+142.343670184 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.582178 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/476e6771-16a7-4aea-bbed-7fce15ec230d-config-volume\") pod \"dns-default-svzx5\" (UID: \"476e6771-16a7-4aea-bbed-7fce15ec230d\") " pod="openshift-dns/dns-default-svzx5" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.584411 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/27f98a08-f7d9-4057-a44c-30c88bfea40a-webhook-cert\") pod \"packageserver-d55dfcdfc-hwfjz\" (UID: \"27f98a08-f7d9-4057-a44c-30c88bfea40a\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hwfjz" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.584517 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/8e92b02b-2ba0-4d57-b018-5592e79a239e-csi-data-dir\") pod \"csi-hostpathplugin-g59tb\" (UID: \"8e92b02b-2ba0-4d57-b018-5592e79a239e\") " pod="hostpath-provisioner/csi-hostpathplugin-g59tb" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.585072 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ed757b20-1bd6-4a44-a75a-f1698623d397-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-djlqs\" (UID: \"ed757b20-1bd6-4a44-a75a-f1698623d397\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-djlqs" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.586583 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7b969c27-e4b6-4cde-8263-b2d879764809-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-jfkwz\" (UID: \"7b969c27-e4b6-4cde-8263-b2d879764809\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jfkwz" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.588572 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/27f98a08-f7d9-4057-a44c-30c88bfea40a-tmpfs\") pod \"packageserver-d55dfcdfc-hwfjz\" (UID: \"27f98a08-f7d9-4057-a44c-30c88bfea40a\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hwfjz" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.589630 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/7e194a07-871a-4465-a851-64890b30b704-node-bootstrap-token\") pod \"machine-config-server-g4wns\" (UID: \"7e194a07-871a-4465-a851-64890b30b704\") " pod="openshift-machine-config-operator/machine-config-server-g4wns" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.591017 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/8e92b02b-2ba0-4d57-b018-5592e79a239e-plugins-dir\") pod \"csi-hostpathplugin-g59tb\" (UID: \"8e92b02b-2ba0-4d57-b018-5592e79a239e\") " 
pod="hostpath-provisioner/csi-hostpathplugin-g59tb" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.594234 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/2573fb7d-f41d-4f5d-a2da-b9c276c817f0-available-featuregates\") pod \"openshift-config-operator-7777fb866f-zzj42\" (UID: \"2573fb7d-f41d-4f5d-a2da-b9c276c817f0\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-zzj42" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.596643 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/8e92b02b-2ba0-4d57-b018-5592e79a239e-mountpoint-dir\") pod \"csi-hostpathplugin-g59tb\" (UID: \"8e92b02b-2ba0-4d57-b018-5592e79a239e\") " pod="hostpath-provisioner/csi-hostpathplugin-g59tb" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.596723 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7b969c27-e4b6-4cde-8263-b2d879764809-audit-dir\") pod \"apiserver-7bbb656c7d-jfkwz\" (UID: \"7b969c27-e4b6-4cde-8263-b2d879764809\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jfkwz" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.597219 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a1f4f2f-c8de-4020-ac5c-470ca0fd95eb-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-h5fz8\" (UID: \"4a1f4f2f-c8de-4020-ac5c-470ca0fd95eb\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-h5fz8" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.597716 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7b969c27-e4b6-4cde-8263-b2d879764809-audit-policies\") pod \"apiserver-7bbb656c7d-jfkwz\" (UID: \"7b969c27-e4b6-4cde-8263-b2d879764809\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jfkwz" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.615671 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf2d8832-8f75-4a49-9f0e-18ea211acb4c-config\") pod \"service-ca-operator-777779d784-nmggh\" (UID: \"cf2d8832-8f75-4a49-9f0e-18ea211acb4c\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-nmggh" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.619269 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8ac12ff5-1646-45c2-822c-7741b913d805-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-wbjmp\" (UID: \"8ac12ff5-1646-45c2-822c-7741b913d805\") " pod="openshift-marketplace/marketplace-operator-79b997595-wbjmp" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.622766 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/8e92b02b-2ba0-4d57-b018-5592e79a239e-registration-dir\") pod \"csi-hostpathplugin-g59tb\" (UID: \"8e92b02b-2ba0-4d57-b018-5592e79a239e\") " pod="hostpath-provisioner/csi-hostpathplugin-g59tb" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.622858 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-sl4zz"] Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.623161 4832 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/7b969c27-e4b6-4cde-8263-b2d879764809-etcd-client\") pod \"apiserver-7bbb656c7d-jfkwz\" (UID: \"7b969c27-e4b6-4cde-8263-b2d879764809\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jfkwz" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.623384 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-znmk2"] Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.623992 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/7b969c27-e4b6-4cde-8263-b2d879764809-encryption-config\") pod \"apiserver-7bbb656c7d-jfkwz\" (UID: \"7b969c27-e4b6-4cde-8263-b2d879764809\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jfkwz" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.624100 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/7b969c27-e4b6-4cde-8263-b2d879764809-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-jfkwz\" (UID: \"7b969c27-e4b6-4cde-8263-b2d879764809\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jfkwz" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.625170 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7b969c27-e4b6-4cde-8263-b2d879764809-serving-cert\") pod \"apiserver-7bbb656c7d-jfkwz\" (UID: \"7b969c27-e4b6-4cde-8263-b2d879764809\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jfkwz" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.625572 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/8e92b02b-2ba0-4d57-b018-5592e79a239e-socket-dir\") pod \"csi-hostpathplugin-g59tb\" (UID: \"8e92b02b-2ba0-4d57-b018-5592e79a239e\") " pod="hostpath-provisioner/csi-hostpathplugin-g59tb" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.626404 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/ff019d52-fc68-49e4-a096-b0f8d5b3cedf-signing-cabundle\") pod \"service-ca-9c57cc56f-m9qgv\" (UID: \"ff019d52-fc68-49e4-a096-b0f8d5b3cedf\") " pod="openshift-service-ca/service-ca-9c57cc56f-m9qgv" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.627060 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42qsw\" (UniqueName: \"kubernetes.io/projected/19441502-65d7-49aa-adc0-1689bbecd75e-kube-api-access-42qsw\") pod \"catalog-operator-68c6474976-7xrp7\" (UID: \"19441502-65d7-49aa-adc0-1689bbecd75e\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7xrp7" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.627387 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e78dfd8f-d109-412c-afcc-2e9fe3786bd3-config-volume\") pod \"collect-profiles-29324160-nmbdb\" (UID: \"e78dfd8f-d109-412c-afcc-2e9fe3786bd3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324160-nmbdb" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.627739 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4a1f4f2f-c8de-4020-ac5c-470ca0fd95eb-serving-cert\") pod 
\"openshift-kube-scheduler-operator-5fdd9b5758-h5fz8\" (UID: \"4a1f4f2f-c8de-4020-ac5c-470ca0fd95eb\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-h5fz8" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.629426 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/ff019d52-fc68-49e4-a096-b0f8d5b3cedf-signing-key\") pod \"service-ca-9c57cc56f-m9qgv\" (UID: \"ff019d52-fc68-49e4-a096-b0f8d5b3cedf\") " pod="openshift-service-ca/service-ca-9c57cc56f-m9qgv" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.630198 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9g4wk" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.630875 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/7e194a07-871a-4465-a851-64890b30b704-certs\") pod \"machine-config-server-g4wns\" (UID: \"7e194a07-871a-4465-a851-64890b30b704\") " pod="openshift-machine-config-operator/machine-config-server-g4wns" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.631128 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/476e6771-16a7-4aea-bbed-7fce15ec230d-metrics-tls\") pod \"dns-default-svzx5\" (UID: \"476e6771-16a7-4aea-bbed-7fce15ec230d\") " pod="openshift-dns/dns-default-svzx5" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.631678 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2573fb7d-f41d-4f5d-a2da-b9c276c817f0-serving-cert\") pod \"openshift-config-operator-7777fb866f-zzj42\" (UID: \"2573fb7d-f41d-4f5d-a2da-b9c276c817f0\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-zzj42" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.631724 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/6dba70f2-c60f-4d7a-a288-8597b4f95145-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-xtqlf\" (UID: \"6dba70f2-c60f-4d7a-a288-8597b4f95145\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-xtqlf" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.631743 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/8206a415-c51d-4ed4-bf97-1c00f05810f4-profile-collector-cert\") pod \"olm-operator-6b444d44fb-lw2b2\" (UID: \"8206a415-c51d-4ed4-bf97-1c00f05810f4\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lw2b2" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.632023 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/8ac12ff5-1646-45c2-822c-7741b913d805-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-wbjmp\" (UID: \"8ac12ff5-1646-45c2-822c-7741b913d805\") " pod="openshift-marketplace/marketplace-operator-79b997595-wbjmp" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.632045 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cf2d8832-8f75-4a49-9f0e-18ea211acb4c-serving-cert\") pod \"service-ca-operator-777779d784-nmggh\" (UID: \"cf2d8832-8f75-4a49-9f0e-18ea211acb4c\") " 
pod="openshift-service-ca-operator/service-ca-operator-777779d784-nmggh" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.633348 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/27f98a08-f7d9-4057-a44c-30c88bfea40a-apiservice-cert\") pod \"packageserver-d55dfcdfc-hwfjz\" (UID: \"27f98a08-f7d9-4057-a44c-30c88bfea40a\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hwfjz" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.633774 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/078d885e-a1eb-4bf6-8f0f-b5058ee3c2dc-cert\") pod \"ingress-canary-qcrcz\" (UID: \"078d885e-a1eb-4bf6-8f0f-b5058ee3c2dc\") " pod="openshift-ingress-canary/ingress-canary-qcrcz" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.646747 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/8206a415-c51d-4ed4-bf97-1c00f05810f4-srv-cert\") pod \"olm-operator-6b444d44fb-lw2b2\" (UID: \"8206a415-c51d-4ed4-bf97-1c00f05810f4\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lw2b2" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.657449 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e78dfd8f-d109-412c-afcc-2e9fe3786bd3-secret-volume\") pod \"collect-profiles-29324160-nmbdb\" (UID: \"e78dfd8f-d109-412c-afcc-2e9fe3786bd3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324160-nmbdb" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.658001 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kc868\" (UniqueName: \"kubernetes.io/projected/7a1bebd5-fb32-4e26-8a34-2efba789ef00-kube-api-access-kc868\") pod \"kube-storage-version-migrator-operator-b67b599dd-4gq5l\" (UID: \"7a1bebd5-fb32-4e26-8a34-2efba789ef00\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4gq5l" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.661525 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42r4w\" (UniqueName: \"kubernetes.io/projected/9e7e54a5-be5d-4ae2-a4e2-a5f1a7349af8-kube-api-access-42r4w\") pod \"package-server-manager-789f6589d5-l5pfg\" (UID: \"9e7e54a5-be5d-4ae2-a4e2-a5f1a7349af8\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-l5pfg" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.664400 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h4hkd\" (UID: \"8793993f-a364-4eb7-bb10-d30f58f0e2c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-h4hkd" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.664835 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhwfw\" (UniqueName: \"kubernetes.io/projected/ed946177-7292-4d1e-8d89-a7e383701224-kube-api-access-mhwfw\") pod \"migrator-59844c95c7-wml5t\" (UID: \"ed946177-7292-4d1e-8d89-a7e383701224\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wml5t" Oct 03 00:09:08 crc kubenswrapper[4832]: E1003 00:09:08.670346 4832 nestedpendingoperations.go:348] Operation 
for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 00:09:09.170321773 +0000 UTC m=+142.432476879 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h4hkd" (UID: "8793993f-a364-4eb7-bb10-d30f58f0e2c0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.688946 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xjqmz"] Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.690450 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqc6m\" (UniqueName: \"kubernetes.io/projected/8793993f-a364-4eb7-bb10-d30f58f0e2c0-kube-api-access-fqc6m\") pod \"image-registry-697d97f7c8-h4hkd\" (UID: \"8793993f-a364-4eb7-bb10-d30f58f0e2c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-h4hkd" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.705443 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8m9z\" (UniqueName: \"kubernetes.io/projected/8445ce5e-0123-4888-a603-9802c94e9c09-kube-api-access-c8m9z\") pod \"console-operator-58897d9998-r4wx4\" (UID: \"8445ce5e-0123-4888-a603-9802c94e9c09\") " pod="openshift-console-operator/console-operator-58897d9998-r4wx4" Oct 03 00:09:08 crc kubenswrapper[4832]: W1003 00:09:08.707909 4832 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc73e2dc6_6174_4a76_a7d2_192ef93b7935.slice/crio-65be413f94ee9948d954b529d5d29bfe0983a567e1060e1c32310b8df86efda2 WatchSource:0}: Error finding container 65be413f94ee9948d954b529d5d29bfe0983a567e1060e1c32310b8df86efda2: Status 404 returned error can't find the container with id 65be413f94ee9948d954b529d5d29bfe0983a567e1060e1c32310b8df86efda2 Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.722616 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tlbzn\" (UniqueName: \"kubernetes.io/projected/9157b726-b8a9-497e-b833-8612e28868d1-kube-api-access-tlbzn\") pod \"machine-config-controller-84d6567774-6v588\" (UID: \"9157b726-b8a9-497e-b833-8612e28868d1\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6v588" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.738472 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fnjzk\" (UniqueName: \"kubernetes.io/projected/c085ae2a-3f96-435a-9104-64d8db170233-kube-api-access-fnjzk\") pod \"openshift-controller-manager-operator-756b6f6bc6-g2phc\" (UID: \"c085ae2a-3f96-435a-9104-64d8db170233\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-g2phc" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.751504 4832 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-r4wx4" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.752716 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-hqmpt"] Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.753432 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pnxzh\" (UniqueName: \"kubernetes.io/projected/127e79f7-9345-4611-b7eb-daf32a27f413-kube-api-access-pnxzh\") pod \"etcd-operator-b45778765-zg4z4\" (UID: \"127e79f7-9345-4611-b7eb-daf32a27f413\") " pod="openshift-etcd-operator/etcd-operator-b45778765-zg4z4" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.771071 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 00:09:08 crc kubenswrapper[4832]: E1003 00:09:08.771269 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 00:09:09.271231456 +0000 UTC m=+142.533386582 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.771380 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7q2c\" (UniqueName: \"kubernetes.io/projected/8315e005-d902-4564-a12a-83dfe30643f2-kube-api-access-j7q2c\") pod \"machine-config-operator-74547568cd-8pn2f\" (UID: \"8315e005-d902-4564-a12a-83dfe30643f2\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8pn2f" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.771533 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h4hkd\" (UID: \"8793993f-a364-4eb7-bb10-d30f58f0e2c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-h4hkd" Oct 03 00:09:08 crc kubenswrapper[4832]: E1003 00:09:08.772163 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 00:09:09.272128276 +0000 UTC m=+142.534305943 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h4hkd" (UID: "8793993f-a364-4eb7-bb10-d30f58f0e2c0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.780934 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/49b771c8-c171-49ac-a805-c13f3fc308e5-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-tv5hb\" (UID: \"49b771c8-c171-49ac-a805-c13f3fc308e5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-tv5hb" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.792274 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-g2phc" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.795855 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-tv5hb" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.809324 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-zg4z4" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.812022 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6v588" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.813452 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8793993f-a364-4eb7-bb10-d30f58f0e2c0-bound-sa-token\") pod \"image-registry-697d97f7c8-h4hkd\" (UID: \"8793993f-a364-4eb7-bb10-d30f58f0e2c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-h4hkd" Oct 03 00:09:08 crc kubenswrapper[4832]: W1003 00:09:08.818542 4832 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d8a5bbc_dc0e_4b7b_b571_83b7be0abe64.slice/crio-b312ff38ddd42529c6b71ee4109fe00dbd38cf6d7e5bc95d292b8141da8603a4 WatchSource:0}: Error finding container b312ff38ddd42529c6b71ee4109fe00dbd38cf6d7e5bc95d292b8141da8603a4: Status 404 returned error can't find the container with id b312ff38ddd42529c6b71ee4109fe00dbd38cf6d7e5bc95d292b8141da8603a4 Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.828807 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-djlqs" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.851473 4832 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4gq5l" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.851835 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fws9m\" (UniqueName: \"kubernetes.io/projected/27f98a08-f7d9-4057-a44c-30c88bfea40a-kube-api-access-fws9m\") pod \"packageserver-d55dfcdfc-hwfjz\" (UID: \"27f98a08-f7d9-4057-a44c-30c88bfea40a\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hwfjz" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.858729 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kk6tj\" (UniqueName: \"kubernetes.io/projected/6dba70f2-c60f-4d7a-a288-8597b4f95145-kube-api-access-kk6tj\") pod \"multus-admission-controller-857f4d67dd-xtqlf\" (UID: \"6dba70f2-c60f-4d7a-a288-8597b4f95145\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-xtqlf" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.873042 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 00:09:08 crc kubenswrapper[4832]: E1003 00:09:08.873719 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 00:09:09.37369629 +0000 UTC m=+142.635851396 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.875042 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7xrp7" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.882155 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4a1f4f2f-c8de-4020-ac5c-470ca0fd95eb-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-h5fz8\" (UID: \"4a1f4f2f-c8de-4020-ac5c-470ca0fd95eb\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-h5fz8" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.899024 4832 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8pn2f" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.917802 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7j7nj\" (UniqueName: \"kubernetes.io/projected/8ac12ff5-1646-45c2-822c-7741b913d805-kube-api-access-7j7nj\") pod \"marketplace-operator-79b997595-wbjmp\" (UID: \"8ac12ff5-1646-45c2-822c-7741b913d805\") " pod="openshift-marketplace/marketplace-operator-79b997595-wbjmp" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.917973 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wml5t" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.926701 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4fw4q\" (UniqueName: \"kubernetes.io/projected/ff019d52-fc68-49e4-a096-b0f8d5b3cedf-kube-api-access-4fw4q\") pod \"service-ca-9c57cc56f-m9qgv\" (UID: \"ff019d52-fc68-49e4-a096-b0f8d5b3cedf\") " pod="openshift-service-ca/service-ca-9c57cc56f-m9qgv" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.947651 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-xtqlf" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.948721 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2gxmq\" (UniqueName: \"kubernetes.io/projected/8e92b02b-2ba0-4d57-b018-5592e79a239e-kube-api-access-2gxmq\") pod \"csi-hostpathplugin-g59tb\" (UID: \"8e92b02b-2ba0-4d57-b018-5592e79a239e\") " pod="hostpath-provisioner/csi-hostpathplugin-g59tb" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.958374 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-l5pfg" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.969316 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jd7nk\" (UniqueName: \"kubernetes.io/projected/e78dfd8f-d109-412c-afcc-2e9fe3786bd3-kube-api-access-jd7nk\") pod \"collect-profiles-29324160-nmbdb\" (UID: \"e78dfd8f-d109-412c-afcc-2e9fe3786bd3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324160-nmbdb" Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.979980 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h4hkd\" (UID: \"8793993f-a364-4eb7-bb10-d30f58f0e2c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-h4hkd" Oct 03 00:09:08 crc kubenswrapper[4832]: E1003 00:09:08.980437 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 00:09:09.480419958 +0000 UTC m=+142.742575064 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h4hkd" (UID: "8793993f-a364-4eb7-bb10-d30f58f0e2c0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 00:09:08 crc kubenswrapper[4832]: I1003 00:09:08.984026 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rhsp\" (UniqueName: \"kubernetes.io/projected/2573fb7d-f41d-4f5d-a2da-b9c276c817f0-kube-api-access-4rhsp\") pod \"openshift-config-operator-7777fb866f-zzj42\" (UID: \"2573fb7d-f41d-4f5d-a2da-b9c276c817f0\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-zzj42" Oct 03 00:09:09 crc kubenswrapper[4832]: I1003 00:09:09.004336 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-zzj42" Oct 03 00:09:09 crc kubenswrapper[4832]: I1003 00:09:09.019320 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nzfxt\" (UniqueName: \"kubernetes.io/projected/cf2d8832-8f75-4a49-9f0e-18ea211acb4c-kube-api-access-nzfxt\") pod \"service-ca-operator-777779d784-nmggh\" (UID: \"cf2d8832-8f75-4a49-9f0e-18ea211acb4c\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-nmggh" Oct 03 00:09:09 crc kubenswrapper[4832]: I1003 00:09:09.019646 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-wbjmp" Oct 03 00:09:09 crc kubenswrapper[4832]: I1003 00:09:09.031994 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-m9qgv" Oct 03 00:09:09 crc kubenswrapper[4832]: I1003 00:09:09.047656 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-nmggh" Oct 03 00:09:09 crc kubenswrapper[4832]: I1003 00:09:09.057482 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x5mxf\" (UniqueName: \"kubernetes.io/projected/8206a415-c51d-4ed4-bf97-1c00f05810f4-kube-api-access-x5mxf\") pod \"olm-operator-6b444d44fb-lw2b2\" (UID: \"8206a415-c51d-4ed4-bf97-1c00f05810f4\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lw2b2" Oct 03 00:09:09 crc kubenswrapper[4832]: I1003 00:09:09.062370 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lw2b2" Oct 03 00:09:09 crc kubenswrapper[4832]: I1003 00:09:09.063220 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-h5fz8" Oct 03 00:09:09 crc kubenswrapper[4832]: I1003 00:09:09.071643 4832 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hwfjz" Oct 03 00:09:09 crc kubenswrapper[4832]: I1003 00:09:09.072166 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2vdx\" (UniqueName: \"kubernetes.io/projected/7b969c27-e4b6-4cde-8263-b2d879764809-kube-api-access-b2vdx\") pod \"apiserver-7bbb656c7d-jfkwz\" (UID: \"7b969c27-e4b6-4cde-8263-b2d879764809\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jfkwz" Oct 03 00:09:09 crc kubenswrapper[4832]: I1003 00:09:09.082699 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 00:09:09 crc kubenswrapper[4832]: E1003 00:09:09.083142 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 00:09:09.583123222 +0000 UTC m=+142.845278328 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 00:09:09 crc kubenswrapper[4832]: I1003 00:09:09.083328 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324160-nmbdb" Oct 03 00:09:09 crc kubenswrapper[4832]: I1003 00:09:09.096574 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jfkwz" Oct 03 00:09:09 crc kubenswrapper[4832]: I1003 00:09:09.105829 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5dh6\" (UniqueName: \"kubernetes.io/projected/078d885e-a1eb-4bf6-8f0f-b5058ee3c2dc-kube-api-access-s5dh6\") pod \"ingress-canary-qcrcz\" (UID: \"078d885e-a1eb-4bf6-8f0f-b5058ee3c2dc\") " pod="openshift-ingress-canary/ingress-canary-qcrcz" Oct 03 00:09:09 crc kubenswrapper[4832]: I1003 00:09:09.120225 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-qcrcz" Oct 03 00:09:09 crc kubenswrapper[4832]: I1003 00:09:09.125276 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-prk2p\" (UniqueName: \"kubernetes.io/projected/476e6771-16a7-4aea-bbed-7fce15ec230d-kube-api-access-prk2p\") pod \"dns-default-svzx5\" (UID: \"476e6771-16a7-4aea-bbed-7fce15ec230d\") " pod="openshift-dns/dns-default-svzx5" Oct 03 00:09:09 crc kubenswrapper[4832]: I1003 00:09:09.133766 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-vgm2t"] Oct 03 00:09:09 crc kubenswrapper[4832]: I1003 00:09:09.134073 4832 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-svzx5" Oct 03 00:09:09 crc kubenswrapper[4832]: I1003 00:09:09.168358 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mh4gc\" (UniqueName: \"kubernetes.io/projected/7e194a07-871a-4465-a851-64890b30b704-kube-api-access-mh4gc\") pod \"machine-config-server-g4wns\" (UID: \"7e194a07-871a-4465-a851-64890b30b704\") " pod="openshift-machine-config-operator/machine-config-server-g4wns" Oct 03 00:09:09 crc kubenswrapper[4832]: I1003 00:09:09.170101 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-g59tb" Oct 03 00:09:09 crc kubenswrapper[4832]: I1003 00:09:09.186352 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h4hkd\" (UID: \"8793993f-a364-4eb7-bb10-d30f58f0e2c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-h4hkd" Oct 03 00:09:09 crc kubenswrapper[4832]: E1003 00:09:09.187177 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 00:09:09.687159319 +0000 UTC m=+142.949314425 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h4hkd" (UID: "8793993f-a364-4eb7-bb10-d30f58f0e2c0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 00:09:09 crc kubenswrapper[4832]: I1003 00:09:09.224162 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-9kl2v"] Oct 03 00:09:09 crc kubenswrapper[4832]: I1003 00:09:09.269657 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-w8cn7" event={"ID":"28399927-5270-4b93-ae66-5c08a318b376","Type":"ContainerStarted","Data":"aa1f80a490d8b23804de88315d0575278c6d14fc09a9de869199676e623a9448"} Oct 03 00:09:09 crc kubenswrapper[4832]: I1003 00:09:09.269711 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-w8cn7" event={"ID":"28399927-5270-4b93-ae66-5c08a318b376","Type":"ContainerStarted","Data":"44ca2d211f32cd870d5d4a0325bffc0b7a1714cdf4a1c4e077e0b1d75f3cc222"} Oct 03 00:09:09 crc kubenswrapper[4832]: I1003 00:09:09.271094 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xjqmz" event={"ID":"7367a27e-1830-4f44-b9c2-543541bbb794","Type":"ContainerStarted","Data":"35324ddb13cdde724f82b0ae8fc43a19dc16458d5a24624ddf2f4aa37c732a04"} Oct 03 00:09:09 crc kubenswrapper[4832]: I1003 00:09:09.272117 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bbh52" event={"ID":"17d5f7b6-d924-4095-a636-4f283d40d2e8","Type":"ContainerStarted","Data":"3ac9091927234dc1826a69b81cff8d51626e3208459378e537e381d74bd2613a"} Oct 03 00:09:09 crc kubenswrapper[4832]: I1003 00:09:09.272142 4832 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bbh52" event={"ID":"17d5f7b6-d924-4095-a636-4f283d40d2e8","Type":"ContainerStarted","Data":"0965344f234f46f610da7912ad4d183a83f0ba8be0cdb81130277171eb2045fa"} Oct 03 00:09:09 crc kubenswrapper[4832]: I1003 00:09:09.281264 4832 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-kjfxn" podStartSLOduration=119.281245752 podStartE2EDuration="1m59.281245752s" podCreationTimestamp="2025-10-03 00:07:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 00:09:09.239116356 +0000 UTC m=+142.501271462" watchObservedRunningTime="2025-10-03 00:09:09.281245752 +0000 UTC m=+142.543400858" Oct 03 00:09:09 crc kubenswrapper[4832]: I1003 00:09:09.282054 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-znmk2" event={"ID":"c73e2dc6-6174-4a76-a7d2-192ef93b7935","Type":"ContainerStarted","Data":"65be413f94ee9948d954b529d5d29bfe0983a567e1060e1c32310b8df86efda2"} Oct 03 00:09:09 crc kubenswrapper[4832]: I1003 00:09:09.291050 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 00:09:09 crc kubenswrapper[4832]: E1003 00:09:09.293311 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 00:09:09.793294907 +0000 UTC m=+143.055450013 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 00:09:09 crc kubenswrapper[4832]: I1003 00:09:09.304987 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29324160-vrdgr" event={"ID":"3335886b-58ec-44d4-9b6a-b7d81b4aa3fe","Type":"ContainerStarted","Data":"9b97b4bc4c5797363fcf9c6b533a58746dc4bd1bd30b05196ed45e9bc9a5ed51"} Oct 03 00:09:09 crc kubenswrapper[4832]: I1003 00:09:09.305044 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29324160-vrdgr" event={"ID":"3335886b-58ec-44d4-9b6a-b7d81b4aa3fe","Type":"ContainerStarted","Data":"079476f1b4f6b1ce0eebb29031c65aef31d18ec1803bf0dffdd295a8e2ab1466"} Oct 03 00:09:09 crc kubenswrapper[4832]: I1003 00:09:09.324645 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hqmpt" event={"ID":"1d8a5bbc-dc0e-4b7b-b571-83b7be0abe64","Type":"ContainerStarted","Data":"b312ff38ddd42529c6b71ee4109fe00dbd38cf6d7e5bc95d292b8141da8603a4"} Oct 03 00:09:09 crc kubenswrapper[4832]: I1003 00:09:09.325884 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-9g4wk"] Oct 03 00:09:09 crc kubenswrapper[4832]: I1003 00:09:09.326653 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-sl4zz" event={"ID":"e422e2ec-57ff-4009-8065-bfa489b34334","Type":"ContainerStarted","Data":"c9c0df18f5f57b232b73dd36a46dc439ebf0e31fcae0b2908750ca6fa19887a2"} Oct 03 00:09:09 crc kubenswrapper[4832]: I1003 00:09:09.341849 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-hxv6p" event={"ID":"a3407ec5-4299-4c9d-9b37-64448de352b7","Type":"ContainerStarted","Data":"d32886054d6035f3aa2fa3509ad912814e2705fdd1784d23c026a49539f53feb"} Oct 03 00:09:09 crc kubenswrapper[4832]: I1003 00:09:09.341907 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-hxv6p" event={"ID":"a3407ec5-4299-4c9d-9b37-64448de352b7","Type":"ContainerStarted","Data":"f1a5bff76e57701a968bcaf1b56ecdbe07d44b18b2bb3ac53e75af48560690a6"} Oct 03 00:09:09 crc kubenswrapper[4832]: I1003 00:09:09.402912 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h4hkd\" (UID: \"8793993f-a364-4eb7-bb10-d30f58f0e2c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-h4hkd" Oct 03 00:09:09 crc kubenswrapper[4832]: E1003 00:09:09.405544 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 00:09:09.90552302 +0000 UTC m=+143.167678126 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h4hkd" (UID: "8793993f-a364-4eb7-bb10-d30f58f0e2c0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 00:09:09 crc kubenswrapper[4832]: I1003 00:09:09.411485 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-g4wns" Oct 03 00:09:09 crc kubenswrapper[4832]: I1003 00:09:09.456766 4832 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ps468" podStartSLOduration=119.456728841 podStartE2EDuration="1m59.456728841s" podCreationTimestamp="2025-10-03 00:07:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 00:09:09.454434514 +0000 UTC m=+142.716589630" watchObservedRunningTime="2025-10-03 00:09:09.456728841 +0000 UTC m=+142.718883947" Oct 03 00:09:09 crc kubenswrapper[4832]: I1003 00:09:09.504604 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 00:09:09 crc kubenswrapper[4832]: E1003 00:09:09.504832 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 00:09:10.004798708 +0000 UTC m=+143.266953814 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 00:09:09 crc kubenswrapper[4832]: I1003 00:09:09.505108 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h4hkd\" (UID: \"8793993f-a364-4eb7-bb10-d30f58f0e2c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-h4hkd" Oct 03 00:09:09 crc kubenswrapper[4832]: E1003 00:09:09.505571 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 00:09:10.005551503 +0000 UTC m=+143.267706619 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h4hkd" (UID: "8793993f-a364-4eb7-bb10-d30f58f0e2c0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 00:09:09 crc kubenswrapper[4832]: I1003 00:09:09.555883 4832 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-w8cn7" Oct 03 00:09:09 crc kubenswrapper[4832]: I1003 00:09:09.607441 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 00:09:09 crc kubenswrapper[4832]: E1003 00:09:09.607778 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 00:09:10.107748318 +0000 UTC m=+143.369903424 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 00:09:09 crc kubenswrapper[4832]: I1003 00:09:09.690422 4832 patch_prober.go:28] interesting pod/router-default-5444994796-w8cn7 container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Oct 03 00:09:09 crc kubenswrapper[4832]: I1003 00:09:09.690796 4832 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-w8cn7" podUID="28399927-5270-4b93-ae66-5c08a318b376" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Oct 03 00:09:09 crc kubenswrapper[4832]: I1003 00:09:09.710326 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h4hkd\" (UID: \"8793993f-a364-4eb7-bb10-d30f58f0e2c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-h4hkd" Oct 03 00:09:09 crc kubenswrapper[4832]: E1003 00:09:09.710647 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 00:09:10.210634017 +0000 UTC m=+143.472789123 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h4hkd" (UID: "8793993f-a364-4eb7-bb10-d30f58f0e2c0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 00:09:09 crc kubenswrapper[4832]: I1003 00:09:09.819396 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 00:09:09 crc kubenswrapper[4832]: E1003 00:09:09.819850 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 00:09:10.319821868 +0000 UTC m=+143.581976974 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 00:09:09 crc kubenswrapper[4832]: I1003 00:09:09.873257 4832 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-r6kz6" podStartSLOduration=118.873235744 podStartE2EDuration="1m58.873235744s" podCreationTimestamp="2025-10-03 00:07:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 00:09:09.835460944 +0000 UTC m=+143.097616050" watchObservedRunningTime="2025-10-03 00:09:09.873235744 +0000 UTC m=+143.135390850" Oct 03 00:09:09 crc kubenswrapper[4832]: I1003 00:09:09.924587 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h4hkd\" (UID: \"8793993f-a364-4eb7-bb10-d30f58f0e2c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-h4hkd" Oct 03 00:09:09 crc kubenswrapper[4832]: E1003 00:09:09.925325 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 00:09:10.425312305 +0000 UTC m=+143.687467411 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h4hkd" (UID: "8793993f-a364-4eb7-bb10-d30f58f0e2c0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 00:09:10 crc kubenswrapper[4832]: I1003 00:09:10.026480 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 00:09:10 crc kubenswrapper[4832]: E1003 00:09:10.026744 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 00:09:10.526707543 +0000 UTC m=+143.788862649 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 00:09:10 crc kubenswrapper[4832]: I1003 00:09:10.026961 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h4hkd\" (UID: \"8793993f-a364-4eb7-bb10-d30f58f0e2c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-h4hkd" Oct 03 00:09:10 crc kubenswrapper[4832]: E1003 00:09:10.027376 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 00:09:10.527365366 +0000 UTC m=+143.789520472 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h4hkd" (UID: "8793993f-a364-4eb7-bb10-d30f58f0e2c0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 00:09:10 crc kubenswrapper[4832]: I1003 00:09:10.056416 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-tv5hb"] Oct 03 00:09:10 crc kubenswrapper[4832]: I1003 00:09:10.129501 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 00:09:10 crc kubenswrapper[4832]: E1003 00:09:10.130069 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 00:09:10.630032478 +0000 UTC m=+143.892187584 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 00:09:10 crc kubenswrapper[4832]: I1003 00:09:10.233714 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h4hkd\" (UID: \"8793993f-a364-4eb7-bb10-d30f58f0e2c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-h4hkd" Oct 03 00:09:10 crc kubenswrapper[4832]: E1003 00:09:10.234101 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 00:09:10.734088455 +0000 UTC m=+143.996243561 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h4hkd" (UID: "8793993f-a364-4eb7-bb10-d30f58f0e2c0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 00:09:10 crc kubenswrapper[4832]: I1003 00:09:10.335183 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 00:09:10 crc kubenswrapper[4832]: E1003 00:09:10.335999 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 00:09:10.835980841 +0000 UTC m=+144.098135947 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 00:09:10 crc kubenswrapper[4832]: I1003 00:09:10.357204 4832 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-v4h7f" podStartSLOduration=119.357181343 podStartE2EDuration="1m59.357181343s" podCreationTimestamp="2025-10-03 00:07:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 00:09:10.356892295 +0000 UTC m=+143.619047401" watchObservedRunningTime="2025-10-03 00:09:10.357181343 +0000 UTC m=+143.619336449" Oct 03 00:09:10 crc kubenswrapper[4832]: I1003 00:09:10.378632 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-g4wns" event={"ID":"7e194a07-871a-4465-a851-64890b30b704","Type":"ContainerStarted","Data":"fcfcf057fb658175126b5331d4f5e30f3444024fdeeef85f940c9d4ca9bb7717"} Oct 03 00:09:10 crc kubenswrapper[4832]: I1003 00:09:10.381571 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hqmpt" event={"ID":"1d8a5bbc-dc0e-4b7b-b571-83b7be0abe64","Type":"ContainerStarted","Data":"051146b5d18de88d505bed620045a0b4cd585af66bc08241d031992851688f52"} Oct 03 00:09:10 crc kubenswrapper[4832]: I1003 00:09:10.386145 4832 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hqmpt" Oct 03 00:09:10 crc kubenswrapper[4832]: I1003 00:09:10.388478 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-vgm2t" event={"ID":"ca8202c0-14e2-4221-9cdb-e5439ba59058","Type":"ContainerStarted","Data":"8384281babb14b288ad4845b4159576250dba3be38cfbf760464dc0b73f14d05"} Oct 03 00:09:10 crc 
kubenswrapper[4832]: I1003 00:09:10.390344 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-sl4zz" event={"ID":"e422e2ec-57ff-4009-8065-bfa489b34334","Type":"ContainerStarted","Data":"a4a5ffc139efce7135582f03ab73f42d14b384b7e5ef935c34b1b475bdd5c4b4"} Oct 03 00:09:10 crc kubenswrapper[4832]: I1003 00:09:10.392079 4832 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-sl4zz" Oct 03 00:09:10 crc kubenswrapper[4832]: I1003 00:09:10.417849 4832 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-hqmpt container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.13:8443/healthz\": dial tcp 10.217.0.13:8443: connect: connection refused" start-of-body= Oct 03 00:09:10 crc kubenswrapper[4832]: I1003 00:09:10.417926 4832 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hqmpt" podUID="1d8a5bbc-dc0e-4b7b-b571-83b7be0abe64" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.13:8443/healthz\": dial tcp 10.217.0.13:8443: connect: connection refused" Oct 03 00:09:10 crc kubenswrapper[4832]: I1003 00:09:10.424987 4832 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-sl4zz container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.23:6443/healthz\": dial tcp 10.217.0.23:6443: connect: connection refused" start-of-body= Oct 03 00:09:10 crc kubenswrapper[4832]: I1003 00:09:10.425055 4832 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-sl4zz" podUID="e422e2ec-57ff-4009-8065-bfa489b34334" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.23:6443/healthz\": dial tcp 10.217.0.23:6443: connect: connection refused" Oct 03 00:09:10 crc kubenswrapper[4832]: I1003 00:09:10.431634 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-znmk2" event={"ID":"c73e2dc6-6174-4a76-a7d2-192ef93b7935","Type":"ContainerStarted","Data":"9cc0e95665afcf2fa9e9922bf3a8acdf145f60f67a87ec120fc5192cd3a63cae"} Oct 03 00:09:10 crc kubenswrapper[4832]: I1003 00:09:10.432523 4832 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-znmk2" Oct 03 00:09:10 crc kubenswrapper[4832]: I1003 00:09:10.441142 4832 patch_prober.go:28] interesting pod/downloads-7954f5f757-znmk2 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Oct 03 00:09:10 crc kubenswrapper[4832]: I1003 00:09:10.441224 4832 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-znmk2" podUID="c73e2dc6-6174-4a76-a7d2-192ef93b7935" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Oct 03 00:09:10 crc kubenswrapper[4832]: I1003 00:09:10.442557 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-h4hkd\" (UID: \"8793993f-a364-4eb7-bb10-d30f58f0e2c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-h4hkd" Oct 03 00:09:10 crc kubenswrapper[4832]: E1003 00:09:10.444172 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 00:09:10.944150557 +0000 UTC m=+144.206305883 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h4hkd" (UID: "8793993f-a364-4eb7-bb10-d30f58f0e2c0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 00:09:10 crc kubenswrapper[4832]: I1003 00:09:10.456722 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9g4wk" event={"ID":"1c106f56-cda7-41ea-9c9b-9b797ccf621b","Type":"ContainerStarted","Data":"f698f9d474268163d1397f638d5cb3f9310a9f4d73fddd4744592c1d1120a2b5"} Oct 03 00:09:10 crc kubenswrapper[4832]: I1003 00:09:10.459722 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xjqmz" event={"ID":"7367a27e-1830-4f44-b9c2-543541bbb794","Type":"ContainerStarted","Data":"5e9987c22155be4a0b9fe0473cb78b8470e90b0626485175e20b5566006a181e"} Oct 03 00:09:10 crc kubenswrapper[4832]: I1003 00:09:10.470403 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-9kl2v" event={"ID":"02e3e27d-2b35-47ed-a82f-686e4ef7dfd8","Type":"ContainerStarted","Data":"0c2b063466f44b2059017649bd4f93c1a3157fd53cd894bdaf3f684ac0cec4cb"} Oct 03 00:09:10 crc kubenswrapper[4832]: I1003 00:09:10.473814 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-tv5hb" event={"ID":"49b771c8-c171-49ac-a805-c13f3fc308e5","Type":"ContainerStarted","Data":"052ea1ba4254768164f42cf1326f626392402a39fcfd4756811cf4698bc79be9"} Oct 03 00:09:10 crc kubenswrapper[4832]: I1003 00:09:10.543837 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 00:09:10 crc kubenswrapper[4832]: E1003 00:09:10.544349 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 00:09:11.044262104 +0000 UTC m=+144.306417220 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 00:09:10 crc kubenswrapper[4832]: I1003 00:09:10.544932 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h4hkd\" (UID: \"8793993f-a364-4eb7-bb10-d30f58f0e2c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-h4hkd" Oct 03 00:09:10 crc kubenswrapper[4832]: E1003 00:09:10.548544 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 00:09:11.048530837 +0000 UTC m=+144.310686153 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h4hkd" (UID: "8793993f-a364-4eb7-bb10-d30f58f0e2c0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 00:09:10 crc kubenswrapper[4832]: I1003 00:09:10.634528 4832 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-pruner-29324160-vrdgr" podStartSLOduration=120.634497457 podStartE2EDuration="2m0.634497457s" podCreationTimestamp="2025-10-03 00:07:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 00:09:10.61672802 +0000 UTC m=+143.878883126" watchObservedRunningTime="2025-10-03 00:09:10.634497457 +0000 UTC m=+143.896652563" Oct 03 00:09:10 crc kubenswrapper[4832]: I1003 00:09:10.652857 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 00:09:10 crc kubenswrapper[4832]: E1003 00:09:10.653354 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 00:09:11.153331461 +0000 UTC m=+144.415486567 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 00:09:10 crc kubenswrapper[4832]: I1003 00:09:10.659523 4832 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-znmk2" podStartSLOduration=120.659491477 podStartE2EDuration="2m0.659491477s" podCreationTimestamp="2025-10-03 00:07:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 00:09:10.652743041 +0000 UTC m=+143.914898147" watchObservedRunningTime="2025-10-03 00:09:10.659491477 +0000 UTC m=+143.921646583" Oct 03 00:09:10 crc kubenswrapper[4832]: I1003 00:09:10.692115 4832 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-sl4zz" podStartSLOduration=120.692090373 podStartE2EDuration="2m0.692090373s" podCreationTimestamp="2025-10-03 00:07:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 00:09:10.690142617 +0000 UTC m=+143.952297723" watchObservedRunningTime="2025-10-03 00:09:10.692090373 +0000 UTC m=+143.954245479" Oct 03 00:09:10 crc kubenswrapper[4832]: I1003 00:09:10.719553 4832 patch_prober.go:28] interesting pod/router-default-5444994796-w8cn7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 03 00:09:10 crc kubenswrapper[4832]: [-]has-synced failed: reason withheld Oct 03 00:09:10 crc kubenswrapper[4832]: [+]process-running ok Oct 03 00:09:10 crc kubenswrapper[4832]: healthz check failed Oct 03 00:09:10 crc kubenswrapper[4832]: I1003 00:09:10.719622 4832 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-w8cn7" podUID="28399927-5270-4b93-ae66-5c08a318b376" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 03 00:09:10 crc kubenswrapper[4832]: I1003 00:09:10.752076 4832 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hqmpt" podStartSLOduration=119.752053029 podStartE2EDuration="1m59.752053029s" podCreationTimestamp="2025-10-03 00:07:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 00:09:10.734434857 +0000 UTC m=+143.996589973" watchObservedRunningTime="2025-10-03 00:09:10.752053029 +0000 UTC m=+144.014208135" Oct 03 00:09:10 crc kubenswrapper[4832]: I1003 00:09:10.760918 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h4hkd\" (UID: \"8793993f-a364-4eb7-bb10-d30f58f0e2c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-h4hkd" Oct 03 00:09:10 crc kubenswrapper[4832]: 
E1003 00:09:10.761246 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 00:09:11.261233118 +0000 UTC m=+144.523388224 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h4hkd" (UID: "8793993f-a364-4eb7-bb10-d30f58f0e2c0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 00:09:10 crc kubenswrapper[4832]: I1003 00:09:10.866723 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 00:09:10 crc kubenswrapper[4832]: E1003 00:09:10.867221 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 00:09:11.36719818 +0000 UTC m=+144.629353286 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 00:09:10 crc kubenswrapper[4832]: I1003 00:09:10.870495 4832 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-hxv6p" podStartSLOduration=120.87046529 podStartE2EDuration="2m0.87046529s" podCreationTimestamp="2025-10-03 00:07:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 00:09:10.785734821 +0000 UTC m=+144.047889927" watchObservedRunningTime="2025-10-03 00:09:10.87046529 +0000 UTC m=+144.132620396" Oct 03 00:09:10 crc kubenswrapper[4832]: I1003 00:09:10.871098 4832 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-w8cn7" podStartSLOduration=119.871092281 podStartE2EDuration="1m59.871092281s" podCreationTimestamp="2025-10-03 00:07:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 00:09:10.867893934 +0000 UTC m=+144.130049040" watchObservedRunningTime="2025-10-03 00:09:10.871092281 +0000 UTC m=+144.133247387" Oct 03 00:09:10 crc kubenswrapper[4832]: I1003 00:09:10.926124 4832 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xjqmz" podStartSLOduration=119.92610006 podStartE2EDuration="1m59.92610006s" podCreationTimestamp="2025-10-03 00:07:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 
+0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 00:09:10.924518327 +0000 UTC m=+144.186673433" watchObservedRunningTime="2025-10-03 00:09:10.92610006 +0000 UTC m=+144.188255166" Oct 03 00:09:10 crc kubenswrapper[4832]: I1003 00:09:10.969610 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h4hkd\" (UID: \"8793993f-a364-4eb7-bb10-d30f58f0e2c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-h4hkd" Oct 03 00:09:10 crc kubenswrapper[4832]: E1003 00:09:10.970151 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 00:09:11.47013271 +0000 UTC m=+144.732287816 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h4hkd" (UID: "8793993f-a364-4eb7-bb10-d30f58f0e2c0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 00:09:11 crc kubenswrapper[4832]: I1003 00:09:11.071548 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 00:09:11 crc kubenswrapper[4832]: E1003 00:09:11.071998 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 00:09:11.571959654 +0000 UTC m=+144.834114760 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 00:09:11 crc kubenswrapper[4832]: I1003 00:09:11.072240 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h4hkd\" (UID: \"8793993f-a364-4eb7-bb10-d30f58f0e2c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-h4hkd" Oct 03 00:09:11 crc kubenswrapper[4832]: E1003 00:09:11.072586 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 00:09:11.572571035 +0000 UTC m=+144.834726141 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h4hkd" (UID: "8793993f-a364-4eb7-bb10-d30f58f0e2c0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 00:09:11 crc kubenswrapper[4832]: I1003 00:09:11.098960 4832 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-qpxvt" Oct 03 00:09:11 crc kubenswrapper[4832]: I1003 00:09:11.099019 4832 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-qpxvt" Oct 03 00:09:11 crc kubenswrapper[4832]: I1003 00:09:11.114261 4832 patch_prober.go:28] interesting pod/apiserver-76f77b778f-qpxvt container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 03 00:09:11 crc kubenswrapper[4832]: [+]log ok Oct 03 00:09:11 crc kubenswrapper[4832]: [+]etcd ok Oct 03 00:09:11 crc kubenswrapper[4832]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 03 00:09:11 crc kubenswrapper[4832]: [+]poststarthook/generic-apiserver-start-informers ok Oct 03 00:09:11 crc kubenswrapper[4832]: [+]poststarthook/max-in-flight-filter ok Oct 03 00:09:11 crc kubenswrapper[4832]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 03 00:09:11 crc kubenswrapper[4832]: [+]poststarthook/image.openshift.io-apiserver-caches ok Oct 03 00:09:11 crc kubenswrapper[4832]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Oct 03 00:09:11 crc kubenswrapper[4832]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Oct 03 00:09:11 crc kubenswrapper[4832]: [+]poststarthook/project.openshift.io-projectcache ok Oct 03 00:09:11 crc kubenswrapper[4832]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Oct 03 00:09:11 crc kubenswrapper[4832]: [+]poststarthook/openshift.io-startinformers ok Oct 03 00:09:11 crc kubenswrapper[4832]: [+]poststarthook/openshift.io-restmapperupdater ok Oct 03 00:09:11 crc kubenswrapper[4832]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Oct 03 00:09:11 crc kubenswrapper[4832]: livez check failed Oct 03 00:09:11 crc kubenswrapper[4832]: I1003 00:09:11.114317 4832 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-qpxvt" podUID="a23d32e8-79df-45c1-8221-f714cbfb4a15" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 03 00:09:11 crc kubenswrapper[4832]: I1003 00:09:11.174500 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 00:09:11 crc kubenswrapper[4832]: E1003 00:09:11.174833 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-03 00:09:11.674728879 +0000 UTC m=+144.936884145 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 00:09:11 crc kubenswrapper[4832]: I1003 00:09:11.175195 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h4hkd\" (UID: \"8793993f-a364-4eb7-bb10-d30f58f0e2c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-h4hkd" Oct 03 00:09:11 crc kubenswrapper[4832]: E1003 00:09:11.175885 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 00:09:11.675869287 +0000 UTC m=+144.938024393 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h4hkd" (UID: "8793993f-a364-4eb7-bb10-d30f58f0e2c0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 00:09:11 crc kubenswrapper[4832]: I1003 00:09:11.280319 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 00:09:11 crc kubenswrapper[4832]: E1003 00:09:11.280655 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 00:09:11.78063584 +0000 UTC m=+145.042790946 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 00:09:11 crc kubenswrapper[4832]: I1003 00:09:11.306151 4832 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-9kl2v" podStartSLOduration=120.306130686 podStartE2EDuration="2m0.306130686s" podCreationTimestamp="2025-10-03 00:07:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 00:09:10.963812579 +0000 UTC m=+144.225967685" watchObservedRunningTime="2025-10-03 00:09:11.306130686 +0000 UTC m=+144.568285792" Oct 03 00:09:11 crc kubenswrapper[4832]: I1003 00:09:11.306286 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-8pn2f"] Oct 03 00:09:11 crc kubenswrapper[4832]: I1003 00:09:11.382236 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h4hkd\" (UID: \"8793993f-a364-4eb7-bb10-d30f58f0e2c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-h4hkd" Oct 03 00:09:11 crc kubenswrapper[4832]: E1003 00:09:11.382972 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 00:09:11.88296074 +0000 UTC m=+145.145115846 (durationBeforeRetry 500ms). 
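
The openshift-apiserver startup probe output a few entries up is the server's own livez report: each named check prints as "[+]name ok" or "[-]name failed: reason withheld", and a single failing check is enough for the endpoint to return 500 with "livez check failed" (the router's healthz output at 00:09:10 and 00:09:11 has the same shape). The following is a rough sketch of that aggregation, illustrative only and not the apiserver's healthz handler.

    package main

    import "fmt"

    // namedCheck mimics one line of the probe body above, e.g. the
    // poststarthook entries in the apiserver livez output.
    type namedCheck struct {
        name string
        ok   bool
    }

    // renderHealth prints "[+]name ok" / "[-]name failed: reason withheld" per
    // check and fails the whole endpoint (HTTP 500) if any check fails.
    func renderHealth(verb string, checks []namedCheck) (status int, body string) {
        healthy := true
        for _, c := range checks {
            if c.ok {
                body += fmt.Sprintf("[+]%s ok\n", c.name)
            } else {
                // The endpoint withholds the failure reason from the caller,
                // hence "reason withheld" in the logged probe output.
                body += fmt.Sprintf("[-]%s failed: reason withheld\n", c.name)
                healthy = false
            }
        }
        if healthy {
            return 200, body + verb + " check passed\n"
        }
        return 500, body + verb + " check failed\n"
    }

    func main() {
        status, body := renderHealth("livez", []namedCheck{
            {"ping", true},
            {"etcd", true},
            {"poststarthook/authorization.openshift.io-bootstrapclusterroles", false},
        })
        fmt.Println(status) // 500, which the kubelet then reports as a startup probe failure
        fmt.Print(body)
    }
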
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h4hkd" (UID: "8793993f-a364-4eb7-bb10-d30f58f0e2c0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 00:09:11 crc kubenswrapper[4832]: I1003 00:09:11.409661 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-djlqs"] Oct 03 00:09:11 crc kubenswrapper[4832]: I1003 00:09:11.416062 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-r4wx4"] Oct 03 00:09:11 crc kubenswrapper[4832]: I1003 00:09:11.433225 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-zg4z4"] Oct 03 00:09:11 crc kubenswrapper[4832]: I1003 00:09:11.443560 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-g2phc"] Oct 03 00:09:11 crc kubenswrapper[4832]: I1003 00:09:11.476380 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-6v588"] Oct 03 00:09:11 crc kubenswrapper[4832]: I1003 00:09:11.484690 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 00:09:11 crc kubenswrapper[4832]: E1003 00:09:11.485131 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 00:09:11.985113794 +0000 UTC m=+145.247268900 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 00:09:11 crc kubenswrapper[4832]: I1003 00:09:11.526055 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bbh52" event={"ID":"17d5f7b6-d924-4095-a636-4f283d40d2e8","Type":"ContainerStarted","Data":"63f3065f352f9b14b3f830d13e522f16229ad120db46e9f5c74c6a5a3c78cdf4"} Oct 03 00:09:11 crc kubenswrapper[4832]: I1003 00:09:11.539975 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-g4wns" event={"ID":"7e194a07-871a-4465-a851-64890b30b704","Type":"ContainerStarted","Data":"904c4d693b6e5c99e30ef31b4fe2112592b86dc18573ba742ffd81ea1c1e053e"} Oct 03 00:09:11 crc kubenswrapper[4832]: I1003 00:09:11.558737 4832 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bbh52" podStartSLOduration=121.558716948 podStartE2EDuration="2m1.558716948s" podCreationTimestamp="2025-10-03 00:07:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 00:09:11.55727053 +0000 UTC m=+144.819425636" watchObservedRunningTime="2025-10-03 00:09:11.558716948 +0000 UTC m=+144.820872054" Oct 03 00:09:11 crc kubenswrapper[4832]: I1003 00:09:11.560046 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-vgm2t" event={"ID":"ca8202c0-14e2-4221-9cdb-e5439ba59058","Type":"ContainerStarted","Data":"1e971e378a777ff6fcf022525a8c3d2e709a78d589e2c9e3e0875771cabcb506"} Oct 03 00:09:11 crc kubenswrapper[4832]: I1003 00:09:11.560087 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-vgm2t" event={"ID":"ca8202c0-14e2-4221-9cdb-e5439ba59058","Type":"ContainerStarted","Data":"04cf2e6daf9e46e267c789cc4ab7f5ef60b1feeace7271910090fbe2d0180c4a"} Oct 03 00:09:11 crc kubenswrapper[4832]: I1003 00:09:11.562522 4832 patch_prober.go:28] interesting pod/router-default-5444994796-w8cn7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 03 00:09:11 crc kubenswrapper[4832]: [-]has-synced failed: reason withheld Oct 03 00:09:11 crc kubenswrapper[4832]: [+]process-running ok Oct 03 00:09:11 crc kubenswrapper[4832]: healthz check failed Oct 03 00:09:11 crc kubenswrapper[4832]: I1003 00:09:11.562681 4832 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-w8cn7" podUID="28399927-5270-4b93-ae66-5c08a318b376" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 03 00:09:11 crc kubenswrapper[4832]: I1003 00:09:11.569525 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-tv5hb" 
event={"ID":"49b771c8-c171-49ac-a805-c13f3fc308e5","Type":"ContainerStarted","Data":"aeb4eb55b490c11e2364b57f6d07937bac0c86699cf5291b6177d4de608b5c62"} Oct 03 00:09:11 crc kubenswrapper[4832]: I1003 00:09:11.582859 4832 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-g4wns" podStartSLOduration=6.582827769 podStartE2EDuration="6.582827769s" podCreationTimestamp="2025-10-03 00:09:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 00:09:11.582472527 +0000 UTC m=+144.844627653" watchObservedRunningTime="2025-10-03 00:09:11.582827769 +0000 UTC m=+144.844982865" Oct 03 00:09:11 crc kubenswrapper[4832]: I1003 00:09:11.588615 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h4hkd\" (UID: \"8793993f-a364-4eb7-bb10-d30f58f0e2c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-h4hkd" Oct 03 00:09:11 crc kubenswrapper[4832]: E1003 00:09:11.619586 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 00:09:12.119568175 +0000 UTC m=+145.381723281 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h4hkd" (UID: "8793993f-a364-4eb7-bb10-d30f58f0e2c0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 00:09:11 crc kubenswrapper[4832]: I1003 00:09:11.623309 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7xrp7"] Oct 03 00:09:11 crc kubenswrapper[4832]: I1003 00:09:11.629290 4832 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-vgm2t" podStartSLOduration=120.62926987 podStartE2EDuration="2m0.62926987s" podCreationTimestamp="2025-10-03 00:07:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 00:09:11.626609461 +0000 UTC m=+144.888764577" watchObservedRunningTime="2025-10-03 00:09:11.62926987 +0000 UTC m=+144.891424976" Oct 03 00:09:11 crc kubenswrapper[4832]: I1003 00:09:11.632327 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9g4wk" event={"ID":"1c106f56-cda7-41ea-9c9b-9b797ccf621b","Type":"ContainerStarted","Data":"e0dcbcba120cfb0323fb1dcf52d51d2e7a33e0129ef00d826e4a784a9602f96f"} Oct 03 00:09:11 crc kubenswrapper[4832]: I1003 00:09:11.632392 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9g4wk" event={"ID":"1c106f56-cda7-41ea-9c9b-9b797ccf621b","Type":"ContainerStarted","Data":"007d9dd9e88c97bda53dfe13f60c8fbaa0a10013f1beaeaa8d257982e1bdce31"} Oct 03 00:09:11 crc kubenswrapper[4832]: I1003 00:09:11.673541 4832 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-qcrcz"] Oct 03 00:09:11 crc kubenswrapper[4832]: I1003 00:09:11.675802 4832 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-tv5hb" podStartSLOduration=120.675789364 podStartE2EDuration="2m0.675789364s" podCreationTimestamp="2025-10-03 00:07:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 00:09:11.646187559 +0000 UTC m=+144.908342665" watchObservedRunningTime="2025-10-03 00:09:11.675789364 +0000 UTC m=+144.937944470" Oct 03 00:09:11 crc kubenswrapper[4832]: I1003 00:09:11.687239 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-9kl2v" event={"ID":"02e3e27d-2b35-47ed-a82f-686e4ef7dfd8","Type":"ContainerStarted","Data":"79cbd8404dcba9ea9fd9650daa4b0291190d18639458eb7854c440e0ca85f7d6"} Oct 03 00:09:11 crc kubenswrapper[4832]: I1003 00:09:11.688446 4832 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9g4wk" podStartSLOduration=120.688429359 podStartE2EDuration="2m0.688429359s" podCreationTimestamp="2025-10-03 00:07:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 00:09:11.684021101 +0000 UTC m=+144.946176207" watchObservedRunningTime="2025-10-03 00:09:11.688429359 +0000 UTC m=+144.950584465" Oct 03 00:09:11 crc kubenswrapper[4832]: I1003 00:09:11.689214 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 00:09:11 crc kubenswrapper[4832]: E1003 00:09:11.690515 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 00:09:12.190499549 +0000 UTC m=+145.452654655 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 00:09:11 crc kubenswrapper[4832]: I1003 00:09:11.715013 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8pn2f" event={"ID":"8315e005-d902-4564-a12a-83dfe30643f2","Type":"ContainerStarted","Data":"5d9faf5d6e7ed42eac82e51a983f42208d8382aed70ce61f65bce70c43a51da7"} Oct 03 00:09:11 crc kubenswrapper[4832]: I1003 00:09:11.717330 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-djlqs" event={"ID":"ed757b20-1bd6-4a44-a75a-f1698623d397","Type":"ContainerStarted","Data":"3dc316c78d54064a2973fa9dcf043b10690fa8eb8f2151ae327c61815f7f4163"} Oct 03 00:09:11 crc kubenswrapper[4832]: I1003 00:09:11.719237 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-r4wx4" event={"ID":"8445ce5e-0123-4888-a603-9802c94e9c09","Type":"ContainerStarted","Data":"929aa6c6b3c5f8a4b0e97830ecb71eb907b84365280ac49ebbc5a463c3882cad"} Oct 03 00:09:11 crc kubenswrapper[4832]: I1003 00:09:11.720773 4832 patch_prober.go:28] interesting pod/downloads-7954f5f757-znmk2 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Oct 03 00:09:11 crc kubenswrapper[4832]: I1003 00:09:11.720832 4832 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-znmk2" podUID="c73e2dc6-6174-4a76-a7d2-192ef93b7935" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Oct 03 00:09:11 crc kubenswrapper[4832]: I1003 00:09:11.739993 4832 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hqmpt" Oct 03 00:09:11 crc kubenswrapper[4832]: I1003 00:09:11.747839 4832 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-sl4zz" Oct 03 00:09:11 crc kubenswrapper[4832]: I1003 00:09:11.749794 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-m9qgv"] Oct 03 00:09:11 crc kubenswrapper[4832]: I1003 00:09:11.787885 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4gq5l"] Oct 03 00:09:11 crc kubenswrapper[4832]: I1003 00:09:11.794080 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h4hkd\" (UID: \"8793993f-a364-4eb7-bb10-d30f58f0e2c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-h4hkd" Oct 03 00:09:11 crc kubenswrapper[4832]: E1003 00:09:11.797283 4832 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 00:09:12.297245018 +0000 UTC m=+145.559400124 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h4hkd" (UID: "8793993f-a364-4eb7-bb10-d30f58f0e2c0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 00:09:11 crc kubenswrapper[4832]: I1003 00:09:11.837879 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-zzj42"] Oct 03 00:09:11 crc kubenswrapper[4832]: I1003 00:09:11.852950 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-wbjmp"] Oct 03 00:09:11 crc kubenswrapper[4832]: I1003 00:09:11.858424 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-xtqlf"] Oct 03 00:09:11 crc kubenswrapper[4832]: I1003 00:09:11.862936 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-h5fz8"] Oct 03 00:09:11 crc kubenswrapper[4832]: I1003 00:09:11.868819 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-jfkwz"] Oct 03 00:09:11 crc kubenswrapper[4832]: I1003 00:09:11.869517 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-wml5t"] Oct 03 00:09:11 crc kubenswrapper[4832]: W1003 00:09:11.879420 4832 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a1bebd5_fb32_4e26_8a34_2efba789ef00.slice/crio-094140e750e263b6076553c2b86510b160aa8cee311aa7b1983816b289fa8188 WatchSource:0}: Error finding container 094140e750e263b6076553c2b86510b160aa8cee311aa7b1983816b289fa8188: Status 404 returned error can't find the container with id 094140e750e263b6076553c2b86510b160aa8cee311aa7b1983816b289fa8188 Oct 03 00:09:11 crc kubenswrapper[4832]: I1003 00:09:11.888867 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-l5pfg"] Oct 03 00:09:11 crc kubenswrapper[4832]: I1003 00:09:11.892536 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lw2b2"] Oct 03 00:09:11 crc kubenswrapper[4832]: I1003 00:09:11.900224 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hwfjz"] Oct 03 00:09:11 crc kubenswrapper[4832]: I1003 00:09:11.902331 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-svzx5"] Oct 03 00:09:11 crc kubenswrapper[4832]: I1003 00:09:11.904975 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-nmggh"] Oct 03 00:09:11 crc kubenswrapper[4832]: I1003 00:09:11.905492 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 00:09:11 crc kubenswrapper[4832]: E1003 00:09:11.905914 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 00:09:12.40589549 +0000 UTC m=+145.668050596 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 00:09:11 crc kubenswrapper[4832]: I1003 00:09:11.917665 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324160-nmbdb"] Oct 03 00:09:11 crc kubenswrapper[4832]: I1003 00:09:11.918496 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-g59tb"] Oct 03 00:09:11 crc kubenswrapper[4832]: W1003 00:09:11.990402 4832 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4a1f4f2f_c8de_4020_ac5c_470ca0fd95eb.slice/crio-c8668582e0e250721a4dbb3164c0b506b2d44b7eb4c7d2aa008242029a9cf0c9 WatchSource:0}: Error finding container c8668582e0e250721a4dbb3164c0b506b2d44b7eb4c7d2aa008242029a9cf0c9: Status 404 returned error can't find the container with id c8668582e0e250721a4dbb3164c0b506b2d44b7eb4c7d2aa008242029a9cf0c9 Oct 03 00:09:11 crc kubenswrapper[4832]: W1003 00:09:11.991736 4832 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2573fb7d_f41d_4f5d_a2da_b9c276c817f0.slice/crio-c017f7d2a891f9644eaec7f2c930294f5c66770bd126970856dab02545a5ee8f WatchSource:0}: Error finding container c017f7d2a891f9644eaec7f2c930294f5c66770bd126970856dab02545a5ee8f: Status 404 returned error can't find the container with id c017f7d2a891f9644eaec7f2c930294f5c66770bd126970856dab02545a5ee8f Oct 03 00:09:12 crc kubenswrapper[4832]: I1003 00:09:12.007211 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h4hkd\" (UID: \"8793993f-a364-4eb7-bb10-d30f58f0e2c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-h4hkd" Oct 03 00:09:12 crc kubenswrapper[4832]: E1003 00:09:12.007809 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 00:09:12.507777936 +0000 UTC m=+145.769933062 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h4hkd" (UID: "8793993f-a364-4eb7-bb10-d30f58f0e2c0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 00:09:12 crc kubenswrapper[4832]: I1003 00:09:12.107698 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 00:09:12 crc kubenswrapper[4832]: E1003 00:09:12.108089 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 00:09:12.608046136 +0000 UTC m=+145.870201252 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 00:09:12 crc kubenswrapper[4832]: I1003 00:09:12.108168 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h4hkd\" (UID: \"8793993f-a364-4eb7-bb10-d30f58f0e2c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-h4hkd" Oct 03 00:09:12 crc kubenswrapper[4832]: E1003 00:09:12.108928 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 00:09:12.608904376 +0000 UTC m=+145.871059482 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h4hkd" (UID: "8793993f-a364-4eb7-bb10-d30f58f0e2c0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 00:09:12 crc kubenswrapper[4832]: W1003 00:09:12.109968 4832 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8ac12ff5_1646_45c2_822c_7741b913d805.slice/crio-5591ff63650c4cc3b7aeaaf29ff049fa6f3bdf5ee448704fdcb4d93b5353cc68 WatchSource:0}: Error finding container 5591ff63650c4cc3b7aeaaf29ff049fa6f3bdf5ee448704fdcb4d93b5353cc68: Status 404 returned error can't find the container with id 5591ff63650c4cc3b7aeaaf29ff049fa6f3bdf5ee448704fdcb4d93b5353cc68 Oct 03 00:09:12 crc kubenswrapper[4832]: W1003 00:09:12.122504 4832 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poded946177_7292_4d1e_8d89_a7e383701224.slice/crio-5f05c3e400cbeec3cf1481d113a23adab31ab8ceda36eb3a238bedfe9be84346 WatchSource:0}: Error finding container 5f05c3e400cbeec3cf1481d113a23adab31ab8ceda36eb3a238bedfe9be84346: Status 404 returned error can't find the container with id 5f05c3e400cbeec3cf1481d113a23adab31ab8ceda36eb3a238bedfe9be84346 Oct 03 00:09:12 crc kubenswrapper[4832]: I1003 00:09:12.209407 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 00:09:12 crc kubenswrapper[4832]: E1003 00:09:12.209850 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 00:09:12.709835099 +0000 UTC m=+145.971990205 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 00:09:12 crc kubenswrapper[4832]: I1003 00:09:12.312215 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h4hkd\" (UID: \"8793993f-a364-4eb7-bb10-d30f58f0e2c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-h4hkd" Oct 03 00:09:12 crc kubenswrapper[4832]: E1003 00:09:12.313362 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-10-03 00:09:12.813338758 +0000 UTC m=+146.075493864 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h4hkd" (UID: "8793993f-a364-4eb7-bb10-d30f58f0e2c0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 00:09:12 crc kubenswrapper[4832]: I1003 00:09:12.414511 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 00:09:12 crc kubenswrapper[4832]: E1003 00:09:12.418624 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 00:09:12.918566846 +0000 UTC m=+146.180721962 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 00:09:12 crc kubenswrapper[4832]: I1003 00:09:12.418864 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h4hkd\" (UID: \"8793993f-a364-4eb7-bb10-d30f58f0e2c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-h4hkd" Oct 03 00:09:12 crc kubenswrapper[4832]: E1003 00:09:12.419683 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 00:09:12.919672483 +0000 UTC m=+146.181827589 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h4hkd" (UID: "8793993f-a364-4eb7-bb10-d30f58f0e2c0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 00:09:12 crc kubenswrapper[4832]: I1003 00:09:12.531317 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 00:09:12 crc kubenswrapper[4832]: E1003 00:09:12.531917 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 00:09:13.031897956 +0000 UTC m=+146.294053062 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 00:09:12 crc kubenswrapper[4832]: I1003 00:09:12.562003 4832 patch_prober.go:28] interesting pod/router-default-5444994796-w8cn7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 03 00:09:12 crc kubenswrapper[4832]: [-]has-synced failed: reason withheld Oct 03 00:09:12 crc kubenswrapper[4832]: [+]process-running ok Oct 03 00:09:12 crc kubenswrapper[4832]: healthz check failed Oct 03 00:09:12 crc kubenswrapper[4832]: I1003 00:09:12.562066 4832 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-w8cn7" podUID="28399927-5270-4b93-ae66-5c08a318b376" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 03 00:09:12 crc kubenswrapper[4832]: I1003 00:09:12.632709 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h4hkd\" (UID: \"8793993f-a364-4eb7-bb10-d30f58f0e2c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-h4hkd" Oct 03 00:09:12 crc kubenswrapper[4832]: E1003 00:09:12.633155 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 00:09:13.133138549 +0000 UTC m=+146.395293655 (durationBeforeRetry 500ms). 
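
The router startup probe failures repeating above (and the earlier readiness probes that fail with "connection refused" before any HTTP status exists) follow the kubelet prober's simple rule: GET the container's health endpoint, treat 2xx/3xx as success, and report anything else, along with the start of the response body, as the probe output. Below is a bare-bones sketch of that client-side check against a hypothetical address; it is not prober.go itself.

    package main

    import (
        "fmt"
        "io"
        "net/http"
        "time"
    )

    // probeHTTP is a stand-in for the kubelet HTTP prober: any status in
    // [200, 400) passes; otherwise the status code and the start of the body
    // are reported, as in the router entries above.
    func probeHTTP(url string) (ok bool, output string) {
        client := &http.Client{Timeout: time.Second}
        resp, err := client.Get(url)
        if err != nil {
            // e.g. "dial tcp 10.217.0.12:8080: connect: connection refused"
            // when the container is not listening yet.
            return false, err.Error()
        }
        defer resp.Body.Close()
        start, _ := io.ReadAll(io.LimitReader(resp.Body, 1024))
        if resp.StatusCode >= 200 && resp.StatusCode < 400 {
            return true, string(start)
        }
        return false, fmt.Sprintf("HTTP probe failed with statuscode: %d; start-of-body=%s",
            resp.StatusCode, start)
    }

    func main() {
        // Hypothetical address standing in for the router's health endpoint.
        ok, out := probeHTTP("http://127.0.0.1:1936/healthz/ready")
        fmt.Println(ok, out)
    }
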
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h4hkd" (UID: "8793993f-a364-4eb7-bb10-d30f58f0e2c0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 00:09:12 crc kubenswrapper[4832]: I1003 00:09:12.735570 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 00:09:12 crc kubenswrapper[4832]: E1003 00:09:12.735916 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 00:09:13.235885384 +0000 UTC m=+146.498040490 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 00:09:12 crc kubenswrapper[4832]: I1003 00:09:12.736088 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h4hkd\" (UID: \"8793993f-a364-4eb7-bb10-d30f58f0e2c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-h4hkd" Oct 03 00:09:12 crc kubenswrapper[4832]: E1003 00:09:12.736595 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 00:09:13.236580118 +0000 UTC m=+146.498735224 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h4hkd" (UID: "8793993f-a364-4eb7-bb10-d30f58f0e2c0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 00:09:12 crc kubenswrapper[4832]: I1003 00:09:12.745992 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-xtqlf" event={"ID":"6dba70f2-c60f-4d7a-a288-8597b4f95145","Type":"ContainerStarted","Data":"3d1d8110de39ceb0cafbaf9c5c3972042dc3528ad856923283a82d55da55c03f"} Oct 03 00:09:12 crc kubenswrapper[4832]: I1003 00:09:12.755113 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7xrp7" event={"ID":"19441502-65d7-49aa-adc0-1689bbecd75e","Type":"ContainerStarted","Data":"7f6350d9f637c3e82bb5f6a3f5d702e2c5703744a66fd4f25cd72a04e49f4069"} Oct 03 00:09:12 crc kubenswrapper[4832]: I1003 00:09:12.755168 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7xrp7" event={"ID":"19441502-65d7-49aa-adc0-1689bbecd75e","Type":"ContainerStarted","Data":"a43b840c501012e817a4cc374f584f8bca5090497d6506a613f1ca2955ac853a"} Oct 03 00:09:12 crc kubenswrapper[4832]: I1003 00:09:12.755765 4832 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7xrp7" Oct 03 00:09:12 crc kubenswrapper[4832]: I1003 00:09:12.762863 4832 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-7xrp7 container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.18:8443/healthz\": dial tcp 10.217.0.18:8443: connect: connection refused" start-of-body= Oct 03 00:09:12 crc kubenswrapper[4832]: I1003 00:09:12.762912 4832 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7xrp7" podUID="19441502-65d7-49aa-adc0-1689bbecd75e" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.18:8443/healthz\": dial tcp 10.217.0.18:8443: connect: connection refused" Oct 03 00:09:12 crc kubenswrapper[4832]: I1003 00:09:12.773272 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lw2b2" event={"ID":"8206a415-c51d-4ed4-bf97-1c00f05810f4","Type":"ContainerStarted","Data":"629dbec62d39e441854681aeda912a2b5a435a51ad52ce0cfa41acdb88dde4b2"} Oct 03 00:09:12 crc kubenswrapper[4832]: I1003 00:09:12.790062 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-m9qgv" event={"ID":"ff019d52-fc68-49e4-a096-b0f8d5b3cedf","Type":"ContainerStarted","Data":"e26c857e3f9922458c0f6de82c0fd468428c8b8649bda7c351ead7794b7620b1"} Oct 03 00:09:12 crc kubenswrapper[4832]: I1003 00:09:12.790128 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-m9qgv" event={"ID":"ff019d52-fc68-49e4-a096-b0f8d5b3cedf","Type":"ContainerStarted","Data":"0d76b1e9287bd9ef7001bfa8bfa44749c24cfe449c5feb06ff006c1e1350949b"} Oct 03 00:09:12 crc kubenswrapper[4832]: I1003 00:09:12.811084 4832 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7xrp7" podStartSLOduration=121.811069082 podStartE2EDuration="2m1.811069082s" podCreationTimestamp="2025-10-03 00:07:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 00:09:12.78486424 +0000 UTC m=+146.047019346" watchObservedRunningTime="2025-10-03 00:09:12.811069082 +0000 UTC m=+146.073224178" Oct 03 00:09:12 crc kubenswrapper[4832]: I1003 00:09:12.812583 4832 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-m9qgv" podStartSLOduration=121.812576872 podStartE2EDuration="2m1.812576872s" podCreationTimestamp="2025-10-03 00:07:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 00:09:12.810342307 +0000 UTC m=+146.072497413" watchObservedRunningTime="2025-10-03 00:09:12.812576872 +0000 UTC m=+146.074731978" Oct 03 00:09:12 crc kubenswrapper[4832]: I1003 00:09:12.839738 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6v588" event={"ID":"9157b726-b8a9-497e-b833-8612e28868d1","Type":"ContainerStarted","Data":"58cfa91b0c088d3883ac650b7ad0b95489d7f57691759eeffad913143beb3345"} Oct 03 00:09:12 crc kubenswrapper[4832]: I1003 00:09:12.839816 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6v588" event={"ID":"9157b726-b8a9-497e-b833-8612e28868d1","Type":"ContainerStarted","Data":"fccb2bf58a428d1de41d3d54002bbce0ebb96a43ffd40c94ba2d1119c502818f"} Oct 03 00:09:12 crc kubenswrapper[4832]: I1003 00:09:12.841584 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 00:09:12 crc kubenswrapper[4832]: E1003 00:09:12.841800 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 00:09:13.341782024 +0000 UTC m=+146.603937120 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 00:09:12 crc kubenswrapper[4832]: E1003 00:09:12.843426 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 00:09:13.343410128 +0000 UTC m=+146.605565244 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h4hkd" (UID: "8793993f-a364-4eb7-bb10-d30f58f0e2c0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 00:09:12 crc kubenswrapper[4832]: I1003 00:09:12.842317 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h4hkd\" (UID: \"8793993f-a364-4eb7-bb10-d30f58f0e2c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-h4hkd" Oct 03 00:09:12 crc kubenswrapper[4832]: I1003 00:09:12.855687 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8pn2f" event={"ID":"8315e005-d902-4564-a12a-83dfe30643f2","Type":"ContainerStarted","Data":"22c808d35cba301da731a4f5dba15f472c6cf476e1ad98a5707f77596159e53c"} Oct 03 00:09:12 crc kubenswrapper[4832]: I1003 00:09:12.855771 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8pn2f" event={"ID":"8315e005-d902-4564-a12a-83dfe30643f2","Type":"ContainerStarted","Data":"5aa4e05eeacc7e829f340f99e3bf0f878a297078a33a6034873605d081e4ac0b"} Oct 03 00:09:12 crc kubenswrapper[4832]: I1003 00:09:12.865774 4832 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6v588" podStartSLOduration=121.865731349 podStartE2EDuration="2m1.865731349s" podCreationTimestamp="2025-10-03 00:07:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 00:09:12.863906698 +0000 UTC m=+146.126061804" watchObservedRunningTime="2025-10-03 00:09:12.865731349 +0000 UTC m=+146.127886455" Oct 03 00:09:12 crc kubenswrapper[4832]: I1003 00:09:12.879592 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-l5pfg" event={"ID":"9e7e54a5-be5d-4ae2-a4e2-a5f1a7349af8","Type":"ContainerStarted","Data":"e25fc66c8c54bd57f07abb28a339d3f656ef9bed8ddd27bab6876424ecc77bbc"} Oct 03 00:09:12 crc kubenswrapper[4832]: I1003 00:09:12.879643 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-l5pfg" event={"ID":"9e7e54a5-be5d-4ae2-a4e2-a5f1a7349af8","Type":"ContainerStarted","Data":"1f839233f2bea3ba01db203df61b8c158a93a302c454c679f0548e1c6a7a9754"} Oct 03 00:09:12 crc kubenswrapper[4832]: I1003 00:09:12.889382 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-g59tb" event={"ID":"8e92b02b-2ba0-4d57-b018-5592e79a239e","Type":"ContainerStarted","Data":"5b6d378c54df3aad319cc32e6c2b7dc130a383a17d5b21dddafdb88aa6e49762"} Oct 03 00:09:12 crc kubenswrapper[4832]: I1003 00:09:12.913510 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-svzx5" 
event={"ID":"476e6771-16a7-4aea-bbed-7fce15ec230d","Type":"ContainerStarted","Data":"dee27a63d25fb7f9f9213d2f18976d49e16027921d24073c5b0988964e812c0e"} Oct 03 00:09:12 crc kubenswrapper[4832]: I1003 00:09:12.915876 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-h5fz8" event={"ID":"4a1f4f2f-c8de-4020-ac5c-470ca0fd95eb","Type":"ContainerStarted","Data":"c8668582e0e250721a4dbb3164c0b506b2d44b7eb4c7d2aa008242029a9cf0c9"} Oct 03 00:09:12 crc kubenswrapper[4832]: I1003 00:09:12.918021 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-qcrcz" event={"ID":"078d885e-a1eb-4bf6-8f0f-b5058ee3c2dc","Type":"ContainerStarted","Data":"d535640a724fe0b5289228a52759b1bf7c077a674224f32141e7e0c101799abd"} Oct 03 00:09:12 crc kubenswrapper[4832]: I1003 00:09:12.918077 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-qcrcz" event={"ID":"078d885e-a1eb-4bf6-8f0f-b5058ee3c2dc","Type":"ContainerStarted","Data":"f07ab43f9bbf13eb2fa7aca99aa680e197d23c529a047f23875ae4e333672a37"} Oct 03 00:09:12 crc kubenswrapper[4832]: I1003 00:09:12.923361 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4gq5l" event={"ID":"7a1bebd5-fb32-4e26-8a34-2efba789ef00","Type":"ContainerStarted","Data":"58125c882cd92eedb3a98d7144fdd203180fc37462098b9e1ad8ed7890c885aa"} Oct 03 00:09:12 crc kubenswrapper[4832]: I1003 00:09:12.923408 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4gq5l" event={"ID":"7a1bebd5-fb32-4e26-8a34-2efba789ef00","Type":"ContainerStarted","Data":"094140e750e263b6076553c2b86510b160aa8cee311aa7b1983816b289fa8188"} Oct 03 00:09:12 crc kubenswrapper[4832]: I1003 00:09:12.936976 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-r4wx4" event={"ID":"8445ce5e-0123-4888-a603-9802c94e9c09","Type":"ContainerStarted","Data":"0230cddddf17912620cc304c835a3dd39c26f3a4ac74dae9979ac20bdcb4b0f0"} Oct 03 00:09:12 crc kubenswrapper[4832]: I1003 00:09:12.938078 4832 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-r4wx4" Oct 03 00:09:12 crc kubenswrapper[4832]: I1003 00:09:12.939226 4832 patch_prober.go:28] interesting pod/console-operator-58897d9998-r4wx4 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.22:8443/readyz\": dial tcp 10.217.0.22:8443: connect: connection refused" start-of-body= Oct 03 00:09:12 crc kubenswrapper[4832]: I1003 00:09:12.939272 4832 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-r4wx4" podUID="8445ce5e-0123-4888-a603-9802c94e9c09" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.22:8443/readyz\": dial tcp 10.217.0.22:8443: connect: connection refused" Oct 03 00:09:12 crc kubenswrapper[4832]: I1003 00:09:12.945520 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 00:09:12 crc kubenswrapper[4832]: E1003 00:09:12.945969 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 00:09:13.445925586 +0000 UTC m=+146.708080692 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 00:09:12 crc kubenswrapper[4832]: I1003 00:09:12.956883 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324160-nmbdb" event={"ID":"e78dfd8f-d109-412c-afcc-2e9fe3786bd3","Type":"ContainerStarted","Data":"f42af82c3e084cc1af6355595344a43e09a34ba7a147ea90d47f225177012fef"} Oct 03 00:09:12 crc kubenswrapper[4832]: I1003 00:09:12.958705 4832 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-h5fz8" podStartSLOduration=121.958678514 podStartE2EDuration="2m1.958678514s" podCreationTimestamp="2025-10-03 00:07:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 00:09:12.95648905 +0000 UTC m=+146.218644156" watchObservedRunningTime="2025-10-03 00:09:12.958678514 +0000 UTC m=+146.220833620" Oct 03 00:09:12 crc kubenswrapper[4832]: I1003 00:09:12.961737 4832 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8pn2f" podStartSLOduration=121.961726176 podStartE2EDuration="2m1.961726176s" podCreationTimestamp="2025-10-03 00:07:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 00:09:12.90381259 +0000 UTC m=+146.165967696" watchObservedRunningTime="2025-10-03 00:09:12.961726176 +0000 UTC m=+146.223881292" Oct 03 00:09:12 crc kubenswrapper[4832]: I1003 00:09:12.963242 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-nmggh" event={"ID":"cf2d8832-8f75-4a49-9f0e-18ea211acb4c","Type":"ContainerStarted","Data":"3af341968b393ad5fb03b02d10b108bc31f614c86925682c923e608621a89df3"} Oct 03 00:09:12 crc kubenswrapper[4832]: I1003 00:09:12.963294 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-nmggh" event={"ID":"cf2d8832-8f75-4a49-9f0e-18ea211acb4c","Type":"ContainerStarted","Data":"d07cd2561796d6a2f7b2e1f911499711711b19fdd61820b4f93eacdc94e133be"} Oct 03 00:09:12 crc kubenswrapper[4832]: I1003 00:09:12.984461 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jfkwz" event={"ID":"7b969c27-e4b6-4cde-8263-b2d879764809","Type":"ContainerStarted","Data":"368266d961c1e61a81b24ebf90f084af081e133e5e64275cb6add8b8a1f8c7cc"} Oct 03 00:09:13 crc kubenswrapper[4832]: I1003 00:09:13.006483 4832 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-r4wx4" podStartSLOduration=123.00645233 podStartE2EDuration="2m3.00645233s" podCreationTimestamp="2025-10-03 00:07:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 00:09:13.005293462 +0000 UTC m=+146.267448568" watchObservedRunningTime="2025-10-03 00:09:13.00645233 +0000 UTC m=+146.268607436" Oct 03 00:09:13 crc kubenswrapper[4832]: I1003 00:09:13.016220 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hwfjz" event={"ID":"27f98a08-f7d9-4057-a44c-30c88bfea40a","Type":"ContainerStarted","Data":"ea42702f4aeda6cdcc24ecbb488a266602faa1547482be3f701d14c0bf852c3b"} Oct 03 00:09:13 crc kubenswrapper[4832]: I1003 00:09:13.017337 4832 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hwfjz" Oct 03 00:09:13 crc kubenswrapper[4832]: I1003 00:09:13.030703 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-djlqs" event={"ID":"ed757b20-1bd6-4a44-a75a-f1698623d397","Type":"ContainerStarted","Data":"bcd109648e46af444d16675de9267592ce74a2f80ef1ea16fa463eb0e3a5cc21"} Oct 03 00:09:13 crc kubenswrapper[4832]: I1003 00:09:13.034351 4832 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4gq5l" podStartSLOduration=122.034335767 podStartE2EDuration="2m2.034335767s" podCreationTimestamp="2025-10-03 00:07:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 00:09:13.031802512 +0000 UTC m=+146.293957608" watchObservedRunningTime="2025-10-03 00:09:13.034335767 +0000 UTC m=+146.296490873" Oct 03 00:09:13 crc kubenswrapper[4832]: I1003 00:09:13.037666 4832 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-hwfjz container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.40:5443/healthz\": dial tcp 10.217.0.40:5443: connect: connection refused" start-of-body= Oct 03 00:09:13 crc kubenswrapper[4832]: I1003 00:09:13.037709 4832 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hwfjz" podUID="27f98a08-f7d9-4057-a44c-30c88bfea40a" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.40:5443/healthz\": dial tcp 10.217.0.40:5443: connect: connection refused" Oct 03 00:09:13 crc kubenswrapper[4832]: I1003 00:09:13.047184 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h4hkd\" (UID: \"8793993f-a364-4eb7-bb10-d30f58f0e2c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-h4hkd" Oct 03 00:09:13 crc kubenswrapper[4832]: E1003 00:09:13.051535 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-10-03 00:09:13.551519955 +0000 UTC m=+146.813675061 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h4hkd" (UID: "8793993f-a364-4eb7-bb10-d30f58f0e2c0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 00:09:13 crc kubenswrapper[4832]: I1003 00:09:13.063294 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-zg4z4" event={"ID":"127e79f7-9345-4611-b7eb-daf32a27f413","Type":"ContainerStarted","Data":"8e3a9c8f83a234a02815608209ce45dfdcb0f72083d9f3c52da33332b00e3f98"} Oct 03 00:09:13 crc kubenswrapper[4832]: I1003 00:09:13.063358 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-zg4z4" event={"ID":"127e79f7-9345-4611-b7eb-daf32a27f413","Type":"ContainerStarted","Data":"acb22b224f3148d8a731457c7428b9e79728ed374f4ce163e6a05c5b668b68d1"} Oct 03 00:09:13 crc kubenswrapper[4832]: I1003 00:09:13.068065 4832 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-qcrcz" podStartSLOduration=8.06803815 podStartE2EDuration="8.06803815s" podCreationTimestamp="2025-10-03 00:09:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 00:09:13.060134685 +0000 UTC m=+146.322289791" watchObservedRunningTime="2025-10-03 00:09:13.06803815 +0000 UTC m=+146.330193266" Oct 03 00:09:13 crc kubenswrapper[4832]: I1003 00:09:13.091924 4832 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-djlqs" podStartSLOduration=122.091893463 podStartE2EDuration="2m2.091893463s" podCreationTimestamp="2025-10-03 00:07:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 00:09:13.09179324 +0000 UTC m=+146.353948346" watchObservedRunningTime="2025-10-03 00:09:13.091893463 +0000 UTC m=+146.354048559" Oct 03 00:09:13 crc kubenswrapper[4832]: I1003 00:09:13.094187 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wml5t" event={"ID":"ed946177-7292-4d1e-8d89-a7e383701224","Type":"ContainerStarted","Data":"5f05c3e400cbeec3cf1481d113a23adab31ab8ceda36eb3a238bedfe9be84346"} Oct 03 00:09:13 crc kubenswrapper[4832]: I1003 00:09:13.100372 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-g2phc" event={"ID":"c085ae2a-3f96-435a-9104-64d8db170233","Type":"ContainerStarted","Data":"b20baef680c8876df058a47c862047716f99f119dcb0d0850022617806a145a0"} Oct 03 00:09:13 crc kubenswrapper[4832]: I1003 00:09:13.100441 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-g2phc" event={"ID":"c085ae2a-3f96-435a-9104-64d8db170233","Type":"ContainerStarted","Data":"b2073c736ca20671d4c9bafb07fac2f7e26912255c51fbc02deaf75d3c16ad7a"} Oct 03 00:09:13 crc kubenswrapper[4832]: I1003 00:09:13.114336 
4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-zzj42" event={"ID":"2573fb7d-f41d-4f5d-a2da-b9c276c817f0","Type":"ContainerStarted","Data":"c017f7d2a891f9644eaec7f2c930294f5c66770bd126970856dab02545a5ee8f"} Oct 03 00:09:13 crc kubenswrapper[4832]: I1003 00:09:13.123493 4832 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-nmggh" podStartSLOduration=122.123471714 podStartE2EDuration="2m2.123471714s" podCreationTimestamp="2025-10-03 00:07:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 00:09:13.119523762 +0000 UTC m=+146.381678888" watchObservedRunningTime="2025-10-03 00:09:13.123471714 +0000 UTC m=+146.385626820" Oct 03 00:09:13 crc kubenswrapper[4832]: I1003 00:09:13.123773 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-wbjmp" event={"ID":"8ac12ff5-1646-45c2-822c-7741b913d805","Type":"ContainerStarted","Data":"b1ce2fb08cf3bbf936ab7af9d29562d673c5017fcb208edefa3025e6c88581a6"} Oct 03 00:09:13 crc kubenswrapper[4832]: I1003 00:09:13.123816 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-wbjmp" event={"ID":"8ac12ff5-1646-45c2-822c-7741b913d805","Type":"ContainerStarted","Data":"5591ff63650c4cc3b7aeaaf29ff049fa6f3bdf5ee448704fdcb4d93b5353cc68"} Oct 03 00:09:13 crc kubenswrapper[4832]: I1003 00:09:13.123834 4832 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-wbjmp" Oct 03 00:09:13 crc kubenswrapper[4832]: I1003 00:09:13.171772 4832 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-wbjmp container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.37:8080/healthz\": dial tcp 10.217.0.37:8080: connect: connection refused" start-of-body= Oct 03 00:09:13 crc kubenswrapper[4832]: I1003 00:09:13.171856 4832 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-wbjmp" podUID="8ac12ff5-1646-45c2-822c-7741b913d805" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.37:8080/healthz\": dial tcp 10.217.0.37:8080: connect: connection refused" Oct 03 00:09:13 crc kubenswrapper[4832]: I1003 00:09:13.174287 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 00:09:13 crc kubenswrapper[4832]: E1003 00:09:13.178123 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 00:09:13.67807464 +0000 UTC m=+146.940229756 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 00:09:13 crc kubenswrapper[4832]: I1003 00:09:13.184075 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h4hkd\" (UID: \"8793993f-a364-4eb7-bb10-d30f58f0e2c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-h4hkd" Oct 03 00:09:13 crc kubenswrapper[4832]: I1003 00:09:13.197734 4832 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hwfjz" podStartSLOduration=122.19771038 podStartE2EDuration="2m2.19771038s" podCreationTimestamp="2025-10-03 00:07:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 00:09:13.171458087 +0000 UTC m=+146.433613183" watchObservedRunningTime="2025-10-03 00:09:13.19771038 +0000 UTC m=+146.459865486" Oct 03 00:09:13 crc kubenswrapper[4832]: E1003 00:09:13.198784 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 00:09:13.698738774 +0000 UTC m=+146.960893880 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h4hkd" (UID: "8793993f-a364-4eb7-bb10-d30f58f0e2c0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 00:09:13 crc kubenswrapper[4832]: I1003 00:09:13.241556 4832 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29324160-nmbdb" podStartSLOduration=123.241521463 podStartE2EDuration="2m3.241521463s" podCreationTimestamp="2025-10-03 00:07:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 00:09:13.198948371 +0000 UTC m=+146.461103487" watchObservedRunningTime="2025-10-03 00:09:13.241521463 +0000 UTC m=+146.503676569" Oct 03 00:09:13 crc kubenswrapper[4832]: I1003 00:09:13.242179 4832 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-g2phc" podStartSLOduration=122.242172105 podStartE2EDuration="2m2.242172105s" podCreationTimestamp="2025-10-03 00:07:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 00:09:13.24173707 +0000 UTC m=+146.503892186" watchObservedRunningTime="2025-10-03 00:09:13.242172105 +0000 UTC m=+146.504327211" Oct 03 00:09:13 crc kubenswrapper[4832]: I1003 00:09:13.287648 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 00:09:13 crc kubenswrapper[4832]: E1003 00:09:13.289691 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 00:09:13.789670392 +0000 UTC m=+147.051825498 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 00:09:13 crc kubenswrapper[4832]: I1003 00:09:13.318571 4832 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-zg4z4" podStartSLOduration=122.318550523 podStartE2EDuration="2m2.318550523s" podCreationTimestamp="2025-10-03 00:07:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 00:09:13.315263592 +0000 UTC m=+146.577418698" watchObservedRunningTime="2025-10-03 00:09:13.318550523 +0000 UTC m=+146.580705619" Oct 03 00:09:13 crc kubenswrapper[4832]: I1003 00:09:13.395662 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h4hkd\" (UID: \"8793993f-a364-4eb7-bb10-d30f58f0e2c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-h4hkd" Oct 03 00:09:13 crc kubenswrapper[4832]: E1003 00:09:13.396573 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 00:09:13.896531034 +0000 UTC m=+147.158686150 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h4hkd" (UID: "8793993f-a364-4eb7-bb10-d30f58f0e2c0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 00:09:13 crc kubenswrapper[4832]: I1003 00:09:13.497821 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 00:09:13 crc kubenswrapper[4832]: E1003 00:09:13.498053 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 00:09:13.997962635 +0000 UTC m=+147.260117741 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 00:09:13 crc kubenswrapper[4832]: I1003 00:09:13.498423 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h4hkd\" (UID: \"8793993f-a364-4eb7-bb10-d30f58f0e2c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-h4hkd" Oct 03 00:09:13 crc kubenswrapper[4832]: E1003 00:09:13.498846 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 00:09:13.998826783 +0000 UTC m=+147.260981889 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h4hkd" (UID: "8793993f-a364-4eb7-bb10-d30f58f0e2c0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 00:09:13 crc kubenswrapper[4832]: I1003 00:09:13.562738 4832 patch_prober.go:28] interesting pod/router-default-5444994796-w8cn7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 03 00:09:13 crc kubenswrapper[4832]: [-]has-synced failed: reason withheld Oct 03 00:09:13 crc kubenswrapper[4832]: [+]process-running ok Oct 03 00:09:13 crc kubenswrapper[4832]: healthz check failed Oct 03 00:09:13 crc kubenswrapper[4832]: I1003 00:09:13.562890 4832 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-w8cn7" podUID="28399927-5270-4b93-ae66-5c08a318b376" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 03 00:09:13 crc kubenswrapper[4832]: I1003 00:09:13.599296 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 00:09:13 crc kubenswrapper[4832]: E1003 00:09:13.599497 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 00:09:14.099466836 +0000 UTC m=+147.361621942 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 00:09:13 crc kubenswrapper[4832]: I1003 00:09:13.599705 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h4hkd\" (UID: \"8793993f-a364-4eb7-bb10-d30f58f0e2c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-h4hkd" Oct 03 00:09:13 crc kubenswrapper[4832]: E1003 00:09:13.600221 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 00:09:14.100204482 +0000 UTC m=+147.362359588 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h4hkd" (UID: "8793993f-a364-4eb7-bb10-d30f58f0e2c0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 00:09:13 crc kubenswrapper[4832]: I1003 00:09:13.701222 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 00:09:13 crc kubenswrapper[4832]: E1003 00:09:13.701504 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 00:09:14.201477246 +0000 UTC m=+147.463632352 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 00:09:13 crc kubenswrapper[4832]: I1003 00:09:13.701909 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h4hkd\" (UID: \"8793993f-a364-4eb7-bb10-d30f58f0e2c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-h4hkd" Oct 03 00:09:13 crc kubenswrapper[4832]: E1003 00:09:13.702385 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 00:09:14.202372207 +0000 UTC m=+147.464527313 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h4hkd" (UID: "8793993f-a364-4eb7-bb10-d30f58f0e2c0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 00:09:13 crc kubenswrapper[4832]: I1003 00:09:13.803468 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 00:09:13 crc kubenswrapper[4832]: E1003 00:09:13.803660 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 00:09:14.30363156 +0000 UTC m=+147.565786666 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 00:09:13 crc kubenswrapper[4832]: I1003 00:09:13.803788 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h4hkd\" (UID: \"8793993f-a364-4eb7-bb10-d30f58f0e2c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-h4hkd" Oct 03 00:09:13 crc kubenswrapper[4832]: E1003 00:09:13.804150 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 00:09:14.304140758 +0000 UTC m=+147.566295864 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h4hkd" (UID: "8793993f-a364-4eb7-bb10-d30f58f0e2c0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 00:09:13 crc kubenswrapper[4832]: I1003 00:09:13.904350 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 00:09:13 crc kubenswrapper[4832]: E1003 00:09:13.904663 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 00:09:14.404646746 +0000 UTC m=+147.666801852 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 00:09:14 crc kubenswrapper[4832]: I1003 00:09:14.006407 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 00:09:14 crc kubenswrapper[4832]: I1003 00:09:14.006447 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 00:09:14 crc kubenswrapper[4832]: I1003 00:09:14.006498 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h4hkd\" (UID: \"8793993f-a364-4eb7-bb10-d30f58f0e2c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-h4hkd" Oct 03 00:09:14 crc kubenswrapper[4832]: I1003 00:09:14.006520 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 00:09:14 crc kubenswrapper[4832]: I1003 00:09:14.006554 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 00:09:14 crc kubenswrapper[4832]: E1003 00:09:14.007974 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 00:09:14.507941089 +0000 UTC m=+147.770096395 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h4hkd" (UID: "8793993f-a364-4eb7-bb10-d30f58f0e2c0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 00:09:14 crc kubenswrapper[4832]: I1003 00:09:14.013887 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 00:09:14 crc kubenswrapper[4832]: I1003 00:09:14.017622 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 00:09:14 crc kubenswrapper[4832]: I1003 00:09:14.017773 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 00:09:14 crc kubenswrapper[4832]: I1003 00:09:14.032284 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 00:09:14 crc kubenswrapper[4832]: I1003 00:09:14.107942 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 00:09:14 crc kubenswrapper[4832]: E1003 00:09:14.108198 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 00:09:14.608158789 +0000 UTC m=+147.870313895 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 00:09:14 crc kubenswrapper[4832]: I1003 00:09:14.108261 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h4hkd\" (UID: \"8793993f-a364-4eb7-bb10-d30f58f0e2c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-h4hkd" Oct 03 00:09:14 crc kubenswrapper[4832]: E1003 00:09:14.108587 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 00:09:14.608564622 +0000 UTC m=+147.870719728 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h4hkd" (UID: "8793993f-a364-4eb7-bb10-d30f58f0e2c0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 00:09:14 crc kubenswrapper[4832]: I1003 00:09:14.131608 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6v588" event={"ID":"9157b726-b8a9-497e-b833-8612e28868d1","Type":"ContainerStarted","Data":"6a6267a6a003a6bcfd0c700c59b0d01bf04f91921d56b79f6696c557ebeaeda0"} Oct 03 00:09:14 crc kubenswrapper[4832]: I1003 00:09:14.134467 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-h5fz8" event={"ID":"4a1f4f2f-c8de-4020-ac5c-470ca0fd95eb","Type":"ContainerStarted","Data":"b93ed6f4051a9a4dd9e30bcad9b1f545c77a683a3b90fc2dbf3d69a60f5b4948"} Oct 03 00:09:14 crc kubenswrapper[4832]: I1003 00:09:14.136327 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lw2b2" event={"ID":"8206a415-c51d-4ed4-bf97-1c00f05810f4","Type":"ContainerStarted","Data":"4e0a4e058c6c55e78fa1368606f535fe75e712124a9f618f332edb931d9e5f7d"} Oct 03 00:09:14 crc kubenswrapper[4832]: I1003 00:09:14.137430 4832 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lw2b2" Oct 03 00:09:14 crc kubenswrapper[4832]: I1003 00:09:14.139867 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-l5pfg" event={"ID":"9e7e54a5-be5d-4ae2-a4e2-a5f1a7349af8","Type":"ContainerStarted","Data":"76ed823aed456d42a0ea8f005407b6b34ed56b1c2b169ca8625eb7f352104ecc"} Oct 03 00:09:14 crc kubenswrapper[4832]: I1003 00:09:14.140158 4832 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-l5pfg" Oct 03 
00:09:14 crc kubenswrapper[4832]: I1003 00:09:14.143547 4832 generic.go:334] "Generic (PLEG): container finished" podID="2573fb7d-f41d-4f5d-a2da-b9c276c817f0" containerID="93c2acd8319904c56124101c853a5b5d9efeb0f6f05627ae4a57659887b56577" exitCode=0 Oct 03 00:09:14 crc kubenswrapper[4832]: I1003 00:09:14.143695 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-zzj42" event={"ID":"2573fb7d-f41d-4f5d-a2da-b9c276c817f0","Type":"ContainerStarted","Data":"30c34dbda3d1c98f62fbf82ef1aa4109a4e08180e0c4636196f2e30796391194"} Oct 03 00:09:14 crc kubenswrapper[4832]: I1003 00:09:14.143771 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-zzj42" event={"ID":"2573fb7d-f41d-4f5d-a2da-b9c276c817f0","Type":"ContainerDied","Data":"93c2acd8319904c56124101c853a5b5d9efeb0f6f05627ae4a57659887b56577"} Oct 03 00:09:14 crc kubenswrapper[4832]: I1003 00:09:14.143895 4832 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-zzj42" Oct 03 00:09:14 crc kubenswrapper[4832]: I1003 00:09:14.145845 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-xtqlf" event={"ID":"6dba70f2-c60f-4d7a-a288-8597b4f95145","Type":"ContainerStarted","Data":"512ff66b15acba4a4486c9ff7d98f80a15ae8bcff227dd38205b19654e896ae9"} Oct 03 00:09:14 crc kubenswrapper[4832]: I1003 00:09:14.145876 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-xtqlf" event={"ID":"6dba70f2-c60f-4d7a-a288-8597b4f95145","Type":"ContainerStarted","Data":"1ec6745b0c18c6bf1f265c99d9174898f2b034f5c92b6ccd037213d8c6fc4de3"} Oct 03 00:09:14 crc kubenswrapper[4832]: I1003 00:09:14.146628 4832 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lw2b2" Oct 03 00:09:14 crc kubenswrapper[4832]: I1003 00:09:14.147400 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-g59tb" event={"ID":"8e92b02b-2ba0-4d57-b018-5592e79a239e","Type":"ContainerStarted","Data":"6608ca7d8e259c6f73257c0c0abd3435d86441b37dfa475a854c4ebe38613239"} Oct 03 00:09:14 crc kubenswrapper[4832]: I1003 00:09:14.148683 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wml5t" event={"ID":"ed946177-7292-4d1e-8d89-a7e383701224","Type":"ContainerStarted","Data":"fed4edf3ee4b2223b2471706fca639c7e577496cff9ec39151d9591bf4bfa5bd"} Oct 03 00:09:14 crc kubenswrapper[4832]: I1003 00:09:14.148711 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wml5t" event={"ID":"ed946177-7292-4d1e-8d89-a7e383701224","Type":"ContainerStarted","Data":"0bace74849e1cf094915e0bbb8ecd1f15d57191b060d137117dcbe60f42836af"} Oct 03 00:09:14 crc kubenswrapper[4832]: I1003 00:09:14.151142 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-svzx5" event={"ID":"476e6771-16a7-4aea-bbed-7fce15ec230d","Type":"ContainerStarted","Data":"ccd3e3831ea2422527241f776fdb9ca493add10b34f4c0db8599291e4eb57229"} Oct 03 00:09:14 crc kubenswrapper[4832]: I1003 00:09:14.151175 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-svzx5" 
event={"ID":"476e6771-16a7-4aea-bbed-7fce15ec230d","Type":"ContainerStarted","Data":"7b1b8b2a39772b225f954229ac60fb901e6688503184e879914bbdd3071a629a"} Oct 03 00:09:14 crc kubenswrapper[4832]: I1003 00:09:14.151225 4832 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-svzx5" Oct 03 00:09:14 crc kubenswrapper[4832]: I1003 00:09:14.157404 4832 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-wbjmp" podStartSLOduration=123.157387734 podStartE2EDuration="2m3.157387734s" podCreationTimestamp="2025-10-03 00:07:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 00:09:13.363312218 +0000 UTC m=+146.625467324" watchObservedRunningTime="2025-10-03 00:09:14.157387734 +0000 UTC m=+147.419542840" Oct 03 00:09:14 crc kubenswrapper[4832]: I1003 00:09:14.159816 4832 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lw2b2" podStartSLOduration=123.159808785 podStartE2EDuration="2m3.159808785s" podCreationTimestamp="2025-10-03 00:07:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 00:09:14.154479176 +0000 UTC m=+147.416634282" watchObservedRunningTime="2025-10-03 00:09:14.159808785 +0000 UTC m=+147.421963891" Oct 03 00:09:14 crc kubenswrapper[4832]: I1003 00:09:14.165420 4832 generic.go:334] "Generic (PLEG): container finished" podID="7b969c27-e4b6-4cde-8263-b2d879764809" containerID="9b1cdd1d39c3b1f2cef86583f634835f7d0422fe0025fcd6285d0e4c2bb24c43" exitCode=0 Oct 03 00:09:14 crc kubenswrapper[4832]: I1003 00:09:14.165486 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jfkwz" event={"ID":"7b969c27-e4b6-4cde-8263-b2d879764809","Type":"ContainerDied","Data":"9b1cdd1d39c3b1f2cef86583f634835f7d0422fe0025fcd6285d0e4c2bb24c43"} Oct 03 00:09:14 crc kubenswrapper[4832]: I1003 00:09:14.175414 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324160-nmbdb" event={"ID":"e78dfd8f-d109-412c-afcc-2e9fe3786bd3","Type":"ContainerStarted","Data":"f311bc33f18744ff242f4bca6d5fd94066c9bd5f88f7c7611f1ca788ab0e80a7"} Oct 03 00:09:14 crc kubenswrapper[4832]: I1003 00:09:14.181150 4832 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-zzj42" podStartSLOduration=124.181130562 podStartE2EDuration="2m4.181130562s" podCreationTimestamp="2025-10-03 00:07:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 00:09:14.180309844 +0000 UTC m=+147.442464950" watchObservedRunningTime="2025-10-03 00:09:14.181130562 +0000 UTC m=+147.443285658" Oct 03 00:09:14 crc kubenswrapper[4832]: I1003 00:09:14.183409 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hwfjz" event={"ID":"27f98a08-f7d9-4057-a44c-30c88bfea40a","Type":"ContainerStarted","Data":"4c1f6b5bc280c48832259244b8c4ea4b7994f55d1bb3e21b159a6bef1c221f4d"} Oct 03 00:09:14 crc kubenswrapper[4832]: I1003 00:09:14.186481 4832 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-wbjmp 
container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.37:8080/healthz\": dial tcp 10.217.0.37:8080: connect: connection refused" start-of-body= Oct 03 00:09:14 crc kubenswrapper[4832]: I1003 00:09:14.186528 4832 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-wbjmp" podUID="8ac12ff5-1646-45c2-822c-7741b913d805" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.37:8080/healthz\": dial tcp 10.217.0.37:8080: connect: connection refused" Oct 03 00:09:14 crc kubenswrapper[4832]: I1003 00:09:14.199239 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 00:09:14 crc kubenswrapper[4832]: I1003 00:09:14.206245 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 00:09:14 crc kubenswrapper[4832]: I1003 00:09:14.209179 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 00:09:14 crc kubenswrapper[4832]: E1003 00:09:14.209377 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 00:09:14.70934902 +0000 UTC m=+147.971504126 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 00:09:14 crc kubenswrapper[4832]: I1003 00:09:14.209494 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h4hkd\" (UID: \"8793993f-a364-4eb7-bb10-d30f58f0e2c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-h4hkd" Oct 03 00:09:14 crc kubenswrapper[4832]: E1003 00:09:14.209889 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 00:09:14.709879948 +0000 UTC m=+147.972035054 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h4hkd" (UID: "8793993f-a364-4eb7-bb10-d30f58f0e2c0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 00:09:14 crc kubenswrapper[4832]: I1003 00:09:14.213622 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 00:09:14 crc kubenswrapper[4832]: I1003 00:09:14.226573 4832 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7xrp7" Oct 03 00:09:14 crc kubenswrapper[4832]: I1003 00:09:14.245556 4832 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-xtqlf" podStartSLOduration=123.245537928 podStartE2EDuration="2m3.245537928s" podCreationTimestamp="2025-10-03 00:07:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 00:09:14.244866965 +0000 UTC m=+147.507022071" watchObservedRunningTime="2025-10-03 00:09:14.245537928 +0000 UTC m=+147.507693034" Oct 03 00:09:14 crc kubenswrapper[4832]: I1003 00:09:14.246502 4832 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-svzx5" podStartSLOduration=9.246496279 podStartE2EDuration="9.246496279s" podCreationTimestamp="2025-10-03 00:09:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 00:09:14.197354538 +0000 UTC m=+147.459509644" watchObservedRunningTime="2025-10-03 00:09:14.246496279 +0000 UTC m=+147.508651385" Oct 03 00:09:14 crc kubenswrapper[4832]: I1003 00:09:14.280651 4832 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-r4wx4" Oct 03 00:09:14 crc kubenswrapper[4832]: I1003 00:09:14.310560 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 00:09:14 crc kubenswrapper[4832]: I1003 00:09:14.311980 4832 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wml5t" podStartSLOduration=123.31195214 podStartE2EDuration="2m3.31195214s" podCreationTimestamp="2025-10-03 00:07:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 00:09:14.309905222 +0000 UTC m=+147.572060328" watchObservedRunningTime="2025-10-03 00:09:14.31195214 +0000 UTC m=+147.574107246" Oct 03 00:09:14 crc kubenswrapper[4832]: E1003 00:09:14.312869 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-03 00:09:14.81284421 +0000 UTC m=+148.074999316 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 00:09:14 crc kubenswrapper[4832]: I1003 00:09:14.359725 4832 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-l5pfg" podStartSLOduration=123.359700275 podStartE2EDuration="2m3.359700275s" podCreationTimestamp="2025-10-03 00:07:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 00:09:14.3524316 +0000 UTC m=+147.614586706" watchObservedRunningTime="2025-10-03 00:09:14.359700275 +0000 UTC m=+147.621855381" Oct 03 00:09:14 crc kubenswrapper[4832]: I1003 00:09:14.413386 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h4hkd\" (UID: \"8793993f-a364-4eb7-bb10-d30f58f0e2c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-h4hkd" Oct 03 00:09:14 crc kubenswrapper[4832]: E1003 00:09:14.414149 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 00:09:14.914135575 +0000 UTC m=+148.176290681 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h4hkd" (UID: "8793993f-a364-4eb7-bb10-d30f58f0e2c0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 00:09:14 crc kubenswrapper[4832]: I1003 00:09:14.515182 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 00:09:14 crc kubenswrapper[4832]: E1003 00:09:14.516970 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 00:09:15.016947942 +0000 UTC m=+148.279103048 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 00:09:14 crc kubenswrapper[4832]: I1003 00:09:14.518998 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h4hkd\" (UID: \"8793993f-a364-4eb7-bb10-d30f58f0e2c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-h4hkd" Oct 03 00:09:14 crc kubenswrapper[4832]: E1003 00:09:14.519539 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 00:09:15.019529399 +0000 UTC m=+148.281684505 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h4hkd" (UID: "8793993f-a364-4eb7-bb10-d30f58f0e2c0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 00:09:14 crc kubenswrapper[4832]: I1003 00:09:14.592334 4832 patch_prober.go:28] interesting pod/router-default-5444994796-w8cn7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 03 00:09:14 crc kubenswrapper[4832]: [-]has-synced failed: reason withheld Oct 03 00:09:14 crc kubenswrapper[4832]: [+]process-running ok Oct 03 00:09:14 crc kubenswrapper[4832]: healthz check failed Oct 03 00:09:14 crc kubenswrapper[4832]: I1003 00:09:14.592432 4832 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-w8cn7" podUID="28399927-5270-4b93-ae66-5c08a318b376" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 03 00:09:14 crc kubenswrapper[4832]: I1003 00:09:14.624875 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 00:09:14 crc kubenswrapper[4832]: E1003 00:09:14.625383 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 00:09:15.125358117 +0000 UTC m=+148.387513223 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 00:09:14 crc kubenswrapper[4832]: I1003 00:09:14.732054 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h4hkd\" (UID: \"8793993f-a364-4eb7-bb10-d30f58f0e2c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-h4hkd" Oct 03 00:09:14 crc kubenswrapper[4832]: E1003 00:09:14.733137 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 00:09:15.233120259 +0000 UTC m=+148.495275355 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h4hkd" (UID: "8793993f-a364-4eb7-bb10-d30f58f0e2c0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 00:09:14 crc kubenswrapper[4832]: W1003 00:09:14.771980 4832 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-fd669b2148d7b491d79988aeb4ca46366327ee80eeb62a19a2a4ef13917fe76d WatchSource:0}: Error finding container fd669b2148d7b491d79988aeb4ca46366327ee80eeb62a19a2a4ef13917fe76d: Status 404 returned error can't find the container with id fd669b2148d7b491d79988aeb4ca46366327ee80eeb62a19a2a4ef13917fe76d Oct 03 00:09:14 crc kubenswrapper[4832]: I1003 00:09:14.833251 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 00:09:14 crc kubenswrapper[4832]: E1003 00:09:14.833579 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 00:09:15.333537445 +0000 UTC m=+148.595692561 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 00:09:14 crc kubenswrapper[4832]: I1003 00:09:14.833708 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h4hkd\" (UID: \"8793993f-a364-4eb7-bb10-d30f58f0e2c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-h4hkd" Oct 03 00:09:14 crc kubenswrapper[4832]: E1003 00:09:14.837782 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 00:09:15.337750217 +0000 UTC m=+148.599905323 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h4hkd" (UID: "8793993f-a364-4eb7-bb10-d30f58f0e2c0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 00:09:14 crc kubenswrapper[4832]: I1003 00:09:14.938245 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 00:09:14 crc kubenswrapper[4832]: E1003 00:09:14.938418 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 00:09:15.43839011 +0000 UTC m=+148.700545236 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 00:09:14 crc kubenswrapper[4832]: I1003 00:09:14.938879 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h4hkd\" (UID: \"8793993f-a364-4eb7-bb10-d30f58f0e2c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-h4hkd" Oct 03 00:09:14 crc kubenswrapper[4832]: E1003 00:09:14.939223 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 00:09:15.439208108 +0000 UTC m=+148.701363214 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h4hkd" (UID: "8793993f-a364-4eb7-bb10-d30f58f0e2c0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 00:09:15 crc kubenswrapper[4832]: I1003 00:09:15.040289 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 00:09:15 crc kubenswrapper[4832]: E1003 00:09:15.040800 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 00:09:15.540774632 +0000 UTC m=+148.802929748 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 00:09:15 crc kubenswrapper[4832]: I1003 00:09:15.143460 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h4hkd\" (UID: \"8793993f-a364-4eb7-bb10-d30f58f0e2c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-h4hkd" Oct 03 00:09:15 crc kubenswrapper[4832]: E1003 00:09:15.143922 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 00:09:15.64390045 +0000 UTC m=+148.906055556 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h4hkd" (UID: "8793993f-a364-4eb7-bb10-d30f58f0e2c0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 00:09:15 crc kubenswrapper[4832]: I1003 00:09:15.184880 4832 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-hwfjz container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.40:5443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 03 00:09:15 crc kubenswrapper[4832]: I1003 00:09:15.185004 4832 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hwfjz" podUID="27f98a08-f7d9-4057-a44c-30c88bfea40a" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.40:5443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 03 00:09:15 crc kubenswrapper[4832]: I1003 00:09:15.210994 4832 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-r6kz6"] Oct 03 00:09:15 crc kubenswrapper[4832]: I1003 00:09:15.211274 4832 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-r6kz6" podUID="69f97b67-d300-4ef4-8951-934c01b68892" containerName="controller-manager" containerID="cri-o://1928c944b1ca0615caa4bcfd67fc4354e5b11f82514ddfc974f8838458f34d63" gracePeriod=30 Oct 03 00:09:15 crc kubenswrapper[4832]: I1003 00:09:15.242995 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"c4c02b61981c1ff00f26191f920c2794c476881d8662612fd7f67f285f7a001d"} Oct 03 00:09:15 crc kubenswrapper[4832]: I1003 00:09:15.245597 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"a360e20b0a7b4c66f746778dd7ddeb12112bfa0a3deabab05e35c30483482ea0"} Oct 03 00:09:15 crc kubenswrapper[4832]: I1003 00:09:15.245645 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"7077092d1b1e92a5a0a1c1b75b9909aeca3acc3e1282c4323f1a8dc03c4dd626"} Oct 03 00:09:15 crc kubenswrapper[4832]: I1003 00:09:15.246191 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 00:09:15 crc kubenswrapper[4832]: E1003 00:09:15.246566 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 00:09:15.74655123 +0000 UTC m=+149.008706336 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 00:09:15 crc kubenswrapper[4832]: I1003 00:09:15.246604 4832 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 00:09:15 crc kubenswrapper[4832]: I1003 00:09:15.251700 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-g59tb" event={"ID":"8e92b02b-2ba0-4d57-b018-5592e79a239e","Type":"ContainerStarted","Data":"1632b99732de3986be89ea46041e1065f9e3051d2cadb5c885f0690a487ed40b"} Oct 03 00:09:15 crc kubenswrapper[4832]: I1003 00:09:15.271510 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jfkwz" event={"ID":"7b969c27-e4b6-4cde-8263-b2d879764809","Type":"ContainerStarted","Data":"7b528e8ac4d7aade82212103d13cc3f59f43fc01cddc92b86cc0019d0eff931f"} Oct 03 00:09:15 crc kubenswrapper[4832]: I1003 00:09:15.282819 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"781ffe0f420c3a4df73059c1c93bdeb3499749fb8587e20ab8e8cc83766ddba7"} Oct 03 00:09:15 crc kubenswrapper[4832]: I1003 00:09:15.282865 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"fd669b2148d7b491d79988aeb4ca46366327ee80eeb62a19a2a4ef13917fe76d"} Oct 03 00:09:15 crc kubenswrapper[4832]: I1003 00:09:15.313785 4832 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jfkwz" podStartSLOduration=124.31376589 
podStartE2EDuration="2m4.31376589s" podCreationTimestamp="2025-10-03 00:07:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 00:09:15.313336926 +0000 UTC m=+148.575492032" watchObservedRunningTime="2025-10-03 00:09:15.31376589 +0000 UTC m=+148.575920996" Oct 03 00:09:15 crc kubenswrapper[4832]: I1003 00:09:15.342365 4832 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hwfjz" Oct 03 00:09:15 crc kubenswrapper[4832]: I1003 00:09:15.347541 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h4hkd\" (UID: \"8793993f-a364-4eb7-bb10-d30f58f0e2c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-h4hkd" Oct 03 00:09:15 crc kubenswrapper[4832]: E1003 00:09:15.349141 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 00:09:15.849125589 +0000 UTC m=+149.111280695 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h4hkd" (UID: "8793993f-a364-4eb7-bb10-d30f58f0e2c0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 00:09:15 crc kubenswrapper[4832]: I1003 00:09:15.448880 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 00:09:15 crc kubenswrapper[4832]: E1003 00:09:15.449082 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 00:09:15.949054159 +0000 UTC m=+149.211209265 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 00:09:15 crc kubenswrapper[4832]: I1003 00:09:15.449308 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h4hkd\" (UID: \"8793993f-a364-4eb7-bb10-d30f58f0e2c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-h4hkd" Oct 03 00:09:15 crc kubenswrapper[4832]: E1003 00:09:15.451455 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 00:09:15.951444279 +0000 UTC m=+149.213599385 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h4hkd" (UID: "8793993f-a364-4eb7-bb10-d30f58f0e2c0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 00:09:15 crc kubenswrapper[4832]: I1003 00:09:15.487861 4832 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-lhnfr"] Oct 03 00:09:15 crc kubenswrapper[4832]: I1003 00:09:15.489074 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lhnfr" Oct 03 00:09:15 crc kubenswrapper[4832]: I1003 00:09:15.494789 4832 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Oct 03 00:09:15 crc kubenswrapper[4832]: I1003 00:09:15.502093 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lhnfr"] Oct 03 00:09:15 crc kubenswrapper[4832]: I1003 00:09:15.551194 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 00:09:15 crc kubenswrapper[4832]: E1003 00:09:15.551422 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 00:09:16.051404439 +0000 UTC m=+149.313559545 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 00:09:15 crc kubenswrapper[4832]: I1003 00:09:15.551736 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h4hkd\" (UID: \"8793993f-a364-4eb7-bb10-d30f58f0e2c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-h4hkd" Oct 03 00:09:15 crc kubenswrapper[4832]: E1003 00:09:15.552130 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 00:09:16.052121484 +0000 UTC m=+149.314276590 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h4hkd" (UID: "8793993f-a364-4eb7-bb10-d30f58f0e2c0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 00:09:15 crc kubenswrapper[4832]: I1003 00:09:15.559680 4832 patch_prober.go:28] interesting pod/router-default-5444994796-w8cn7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 03 00:09:15 crc kubenswrapper[4832]: [-]has-synced failed: reason withheld Oct 03 00:09:15 crc kubenswrapper[4832]: [+]process-running ok Oct 03 00:09:15 crc kubenswrapper[4832]: healthz check failed Oct 03 00:09:15 crc kubenswrapper[4832]: I1003 00:09:15.559885 4832 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-w8cn7" podUID="28399927-5270-4b93-ae66-5c08a318b376" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 03 00:09:15 crc kubenswrapper[4832]: I1003 00:09:15.652571 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 00:09:15 crc kubenswrapper[4832]: E1003 00:09:15.652738 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 00:09:16.152707955 +0000 UTC m=+149.414863061 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 00:09:15 crc kubenswrapper[4832]: I1003 00:09:15.653105 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h4hkd\" (UID: \"8793993f-a364-4eb7-bb10-d30f58f0e2c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-h4hkd" Oct 03 00:09:15 crc kubenswrapper[4832]: I1003 00:09:15.653209 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bbb2ac77-f639-41e8-897d-7744dd4e987d-utilities\") pod \"community-operators-lhnfr\" (UID: \"bbb2ac77-f639-41e8-897d-7744dd4e987d\") " pod="openshift-marketplace/community-operators-lhnfr" Oct 03 00:09:15 crc kubenswrapper[4832]: I1003 00:09:15.653365 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bbb2ac77-f639-41e8-897d-7744dd4e987d-catalog-content\") pod \"community-operators-lhnfr\" (UID: \"bbb2ac77-f639-41e8-897d-7744dd4e987d\") " pod="openshift-marketplace/community-operators-lhnfr" Oct 03 00:09:15 crc kubenswrapper[4832]: E1003 00:09:15.653420 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 00:09:16.153412429 +0000 UTC m=+149.415567535 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h4hkd" (UID: "8793993f-a364-4eb7-bb10-d30f58f0e2c0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 00:09:15 crc kubenswrapper[4832]: I1003 00:09:15.653500 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xd2dt\" (UniqueName: \"kubernetes.io/projected/bbb2ac77-f639-41e8-897d-7744dd4e987d-kube-api-access-xd2dt\") pod \"community-operators-lhnfr\" (UID: \"bbb2ac77-f639-41e8-897d-7744dd4e987d\") " pod="openshift-marketplace/community-operators-lhnfr" Oct 03 00:09:15 crc kubenswrapper[4832]: I1003 00:09:15.658513 4832 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Oct 03 00:09:15 crc kubenswrapper[4832]: I1003 00:09:15.684955 4832 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-j9lmn"] Oct 03 00:09:15 crc kubenswrapper[4832]: I1003 00:09:15.686146 4832 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-j9lmn" Oct 03 00:09:15 crc kubenswrapper[4832]: I1003 00:09:15.689263 4832 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Oct 03 00:09:15 crc kubenswrapper[4832]: I1003 00:09:15.704377 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-j9lmn"] Oct 03 00:09:15 crc kubenswrapper[4832]: I1003 00:09:15.754695 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 00:09:15 crc kubenswrapper[4832]: I1003 00:09:15.754971 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bbb2ac77-f639-41e8-897d-7744dd4e987d-catalog-content\") pod \"community-operators-lhnfr\" (UID: \"bbb2ac77-f639-41e8-897d-7744dd4e987d\") " pod="openshift-marketplace/community-operators-lhnfr" Oct 03 00:09:15 crc kubenswrapper[4832]: I1003 00:09:15.755013 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xd2dt\" (UniqueName: \"kubernetes.io/projected/bbb2ac77-f639-41e8-897d-7744dd4e987d-kube-api-access-xd2dt\") pod \"community-operators-lhnfr\" (UID: \"bbb2ac77-f639-41e8-897d-7744dd4e987d\") " pod="openshift-marketplace/community-operators-lhnfr" Oct 03 00:09:15 crc kubenswrapper[4832]: I1003 00:09:15.755059 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bbb2ac77-f639-41e8-897d-7744dd4e987d-utilities\") pod \"community-operators-lhnfr\" (UID: \"bbb2ac77-f639-41e8-897d-7744dd4e987d\") " pod="openshift-marketplace/community-operators-lhnfr" Oct 03 00:09:15 crc kubenswrapper[4832]: I1003 00:09:15.755466 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bbb2ac77-f639-41e8-897d-7744dd4e987d-utilities\") pod \"community-operators-lhnfr\" (UID: \"bbb2ac77-f639-41e8-897d-7744dd4e987d\") " pod="openshift-marketplace/community-operators-lhnfr" Oct 03 00:09:15 crc kubenswrapper[4832]: E1003 00:09:15.755546 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 00:09:16.255530442 +0000 UTC m=+149.517685548 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 00:09:15 crc kubenswrapper[4832]: I1003 00:09:15.755748 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bbb2ac77-f639-41e8-897d-7744dd4e987d-catalog-content\") pod \"community-operators-lhnfr\" (UID: \"bbb2ac77-f639-41e8-897d-7744dd4e987d\") " pod="openshift-marketplace/community-operators-lhnfr" Oct 03 00:09:15 crc kubenswrapper[4832]: I1003 00:09:15.779058 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xd2dt\" (UniqueName: \"kubernetes.io/projected/bbb2ac77-f639-41e8-897d-7744dd4e987d-kube-api-access-xd2dt\") pod \"community-operators-lhnfr\" (UID: \"bbb2ac77-f639-41e8-897d-7744dd4e987d\") " pod="openshift-marketplace/community-operators-lhnfr" Oct 03 00:09:15 crc kubenswrapper[4832]: I1003 00:09:15.808924 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lhnfr" Oct 03 00:09:15 crc kubenswrapper[4832]: I1003 00:09:15.857626 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h4hkd\" (UID: \"8793993f-a364-4eb7-bb10-d30f58f0e2c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-h4hkd" Oct 03 00:09:15 crc kubenswrapper[4832]: I1003 00:09:15.857691 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/93fca13a-c8e1-4239-b419-e4055be395cd-catalog-content\") pod \"certified-operators-j9lmn\" (UID: \"93fca13a-c8e1-4239-b419-e4055be395cd\") " pod="openshift-marketplace/certified-operators-j9lmn" Oct 03 00:09:15 crc kubenswrapper[4832]: I1003 00:09:15.857714 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93fca13a-c8e1-4239-b419-e4055be395cd-utilities\") pod \"certified-operators-j9lmn\" (UID: \"93fca13a-c8e1-4239-b419-e4055be395cd\") " pod="openshift-marketplace/certified-operators-j9lmn" Oct 03 00:09:15 crc kubenswrapper[4832]: I1003 00:09:15.857737 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5zgb\" (UniqueName: \"kubernetes.io/projected/93fca13a-c8e1-4239-b419-e4055be395cd-kube-api-access-g5zgb\") pod \"certified-operators-j9lmn\" (UID: \"93fca13a-c8e1-4239-b419-e4055be395cd\") " pod="openshift-marketplace/certified-operators-j9lmn" Oct 03 00:09:15 crc kubenswrapper[4832]: E1003 00:09:15.864113 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 00:09:16.364086411 +0000 UTC m=+149.626241507 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h4hkd" (UID: "8793993f-a364-4eb7-bb10-d30f58f0e2c0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 00:09:15 crc kubenswrapper[4832]: I1003 00:09:15.886167 4832 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-dv2sr"] Oct 03 00:09:15 crc kubenswrapper[4832]: I1003 00:09:15.887431 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dv2sr" Oct 03 00:09:15 crc kubenswrapper[4832]: I1003 00:09:15.905003 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-dv2sr"] Oct 03 00:09:15 crc kubenswrapper[4832]: I1003 00:09:15.958554 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 00:09:15 crc kubenswrapper[4832]: I1003 00:09:15.958860 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/93fca13a-c8e1-4239-b419-e4055be395cd-catalog-content\") pod \"certified-operators-j9lmn\" (UID: \"93fca13a-c8e1-4239-b419-e4055be395cd\") " pod="openshift-marketplace/certified-operators-j9lmn" Oct 03 00:09:15 crc kubenswrapper[4832]: I1003 00:09:15.958898 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93fca13a-c8e1-4239-b419-e4055be395cd-utilities\") pod \"certified-operators-j9lmn\" (UID: \"93fca13a-c8e1-4239-b419-e4055be395cd\") " pod="openshift-marketplace/certified-operators-j9lmn" Oct 03 00:09:15 crc kubenswrapper[4832]: I1003 00:09:15.958924 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g5zgb\" (UniqueName: \"kubernetes.io/projected/93fca13a-c8e1-4239-b419-e4055be395cd-kube-api-access-g5zgb\") pod \"certified-operators-j9lmn\" (UID: \"93fca13a-c8e1-4239-b419-e4055be395cd\") " pod="openshift-marketplace/certified-operators-j9lmn" Oct 03 00:09:15 crc kubenswrapper[4832]: E1003 00:09:15.959392 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 00:09:16.459373615 +0000 UTC m=+149.721528721 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 00:09:15 crc kubenswrapper[4832]: I1003 00:09:15.959835 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/93fca13a-c8e1-4239-b419-e4055be395cd-catalog-content\") pod \"certified-operators-j9lmn\" (UID: \"93fca13a-c8e1-4239-b419-e4055be395cd\") " pod="openshift-marketplace/certified-operators-j9lmn" Oct 03 00:09:15 crc kubenswrapper[4832]: I1003 00:09:15.960105 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93fca13a-c8e1-4239-b419-e4055be395cd-utilities\") pod \"certified-operators-j9lmn\" (UID: \"93fca13a-c8e1-4239-b419-e4055be395cd\") " pod="openshift-marketplace/certified-operators-j9lmn" Oct 03 00:09:16 crc kubenswrapper[4832]: I1003 00:09:16.000460 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5zgb\" (UniqueName: \"kubernetes.io/projected/93fca13a-c8e1-4239-b419-e4055be395cd-kube-api-access-g5zgb\") pod \"certified-operators-j9lmn\" (UID: \"93fca13a-c8e1-4239-b419-e4055be395cd\") " pod="openshift-marketplace/certified-operators-j9lmn" Oct 03 00:09:16 crc kubenswrapper[4832]: I1003 00:09:16.003077 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-j9lmn" Oct 03 00:09:16 crc kubenswrapper[4832]: I1003 00:09:16.060249 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f22mv\" (UniqueName: \"kubernetes.io/projected/13a851af-b472-4647-8093-8b0cd577b21a-kube-api-access-f22mv\") pod \"community-operators-dv2sr\" (UID: \"13a851af-b472-4647-8093-8b0cd577b21a\") " pod="openshift-marketplace/community-operators-dv2sr" Oct 03 00:09:16 crc kubenswrapper[4832]: I1003 00:09:16.060282 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13a851af-b472-4647-8093-8b0cd577b21a-catalog-content\") pod \"community-operators-dv2sr\" (UID: \"13a851af-b472-4647-8093-8b0cd577b21a\") " pod="openshift-marketplace/community-operators-dv2sr" Oct 03 00:09:16 crc kubenswrapper[4832]: I1003 00:09:16.060340 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h4hkd\" (UID: \"8793993f-a364-4eb7-bb10-d30f58f0e2c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-h4hkd" Oct 03 00:09:16 crc kubenswrapper[4832]: I1003 00:09:16.060381 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13a851af-b472-4647-8093-8b0cd577b21a-utilities\") pod \"community-operators-dv2sr\" (UID: \"13a851af-b472-4647-8093-8b0cd577b21a\") " pod="openshift-marketplace/community-operators-dv2sr" Oct 03 00:09:16 crc kubenswrapper[4832]: 
E1003 00:09:16.060705 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 00:09:16.560689401 +0000 UTC m=+149.822844507 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h4hkd" (UID: "8793993f-a364-4eb7-bb10-d30f58f0e2c0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 00:09:16 crc kubenswrapper[4832]: I1003 00:09:16.076365 4832 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-5wfsk"] Oct 03 00:09:16 crc kubenswrapper[4832]: I1003 00:09:16.077373 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5wfsk" Oct 03 00:09:16 crc kubenswrapper[4832]: I1003 00:09:16.090610 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5wfsk"] Oct 03 00:09:16 crc kubenswrapper[4832]: I1003 00:09:16.115035 4832 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-qpxvt" Oct 03 00:09:16 crc kubenswrapper[4832]: I1003 00:09:16.126103 4832 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-qpxvt" Oct 03 00:09:16 crc kubenswrapper[4832]: I1003 00:09:16.161532 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 00:09:16 crc kubenswrapper[4832]: I1003 00:09:16.162415 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bad1920b-7616-41f9-b2f9-84b3073faede-utilities\") pod \"certified-operators-5wfsk\" (UID: \"bad1920b-7616-41f9-b2f9-84b3073faede\") " pod="openshift-marketplace/certified-operators-5wfsk" Oct 03 00:09:16 crc kubenswrapper[4832]: I1003 00:09:16.162458 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13a851af-b472-4647-8093-8b0cd577b21a-utilities\") pod \"community-operators-dv2sr\" (UID: \"13a851af-b472-4647-8093-8b0cd577b21a\") " pod="openshift-marketplace/community-operators-dv2sr" Oct 03 00:09:16 crc kubenswrapper[4832]: I1003 00:09:16.162475 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bad1920b-7616-41f9-b2f9-84b3073faede-catalog-content\") pod \"certified-operators-5wfsk\" (UID: \"bad1920b-7616-41f9-b2f9-84b3073faede\") " pod="openshift-marketplace/certified-operators-5wfsk" Oct 03 00:09:16 crc kubenswrapper[4832]: I1003 00:09:16.162519 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f22mv\" (UniqueName: 
\"kubernetes.io/projected/13a851af-b472-4647-8093-8b0cd577b21a-kube-api-access-f22mv\") pod \"community-operators-dv2sr\" (UID: \"13a851af-b472-4647-8093-8b0cd577b21a\") " pod="openshift-marketplace/community-operators-dv2sr" Oct 03 00:09:16 crc kubenswrapper[4832]: I1003 00:09:16.162539 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13a851af-b472-4647-8093-8b0cd577b21a-catalog-content\") pod \"community-operators-dv2sr\" (UID: \"13a851af-b472-4647-8093-8b0cd577b21a\") " pod="openshift-marketplace/community-operators-dv2sr" Oct 03 00:09:16 crc kubenswrapper[4832]: I1003 00:09:16.162566 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2lxtj\" (UniqueName: \"kubernetes.io/projected/bad1920b-7616-41f9-b2f9-84b3073faede-kube-api-access-2lxtj\") pod \"certified-operators-5wfsk\" (UID: \"bad1920b-7616-41f9-b2f9-84b3073faede\") " pod="openshift-marketplace/certified-operators-5wfsk" Oct 03 00:09:16 crc kubenswrapper[4832]: E1003 00:09:16.162942 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 00:09:16.66266699 +0000 UTC m=+149.924822096 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 00:09:16 crc kubenswrapper[4832]: I1003 00:09:16.164011 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13a851af-b472-4647-8093-8b0cd577b21a-utilities\") pod \"community-operators-dv2sr\" (UID: \"13a851af-b472-4647-8093-8b0cd577b21a\") " pod="openshift-marketplace/community-operators-dv2sr" Oct 03 00:09:16 crc kubenswrapper[4832]: I1003 00:09:16.164024 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13a851af-b472-4647-8093-8b0cd577b21a-catalog-content\") pod \"community-operators-dv2sr\" (UID: \"13a851af-b472-4647-8093-8b0cd577b21a\") " pod="openshift-marketplace/community-operators-dv2sr" Oct 03 00:09:16 crc kubenswrapper[4832]: I1003 00:09:16.211742 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f22mv\" (UniqueName: \"kubernetes.io/projected/13a851af-b472-4647-8093-8b0cd577b21a-kube-api-access-f22mv\") pod \"community-operators-dv2sr\" (UID: \"13a851af-b472-4647-8093-8b0cd577b21a\") " pod="openshift-marketplace/community-operators-dv2sr" Oct 03 00:09:16 crc kubenswrapper[4832]: I1003 00:09:16.214715 4832 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-dv2sr" Oct 03 00:09:16 crc kubenswrapper[4832]: I1003 00:09:16.265764 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2lxtj\" (UniqueName: \"kubernetes.io/projected/bad1920b-7616-41f9-b2f9-84b3073faede-kube-api-access-2lxtj\") pod \"certified-operators-5wfsk\" (UID: \"bad1920b-7616-41f9-b2f9-84b3073faede\") " pod="openshift-marketplace/certified-operators-5wfsk" Oct 03 00:09:16 crc kubenswrapper[4832]: I1003 00:09:16.265842 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h4hkd\" (UID: \"8793993f-a364-4eb7-bb10-d30f58f0e2c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-h4hkd" Oct 03 00:09:16 crc kubenswrapper[4832]: I1003 00:09:16.265873 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bad1920b-7616-41f9-b2f9-84b3073faede-utilities\") pod \"certified-operators-5wfsk\" (UID: \"bad1920b-7616-41f9-b2f9-84b3073faede\") " pod="openshift-marketplace/certified-operators-5wfsk" Oct 03 00:09:16 crc kubenswrapper[4832]: I1003 00:09:16.265901 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bad1920b-7616-41f9-b2f9-84b3073faede-catalog-content\") pod \"certified-operators-5wfsk\" (UID: \"bad1920b-7616-41f9-b2f9-84b3073faede\") " pod="openshift-marketplace/certified-operators-5wfsk" Oct 03 00:09:16 crc kubenswrapper[4832]: I1003 00:09:16.266293 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bad1920b-7616-41f9-b2f9-84b3073faede-catalog-content\") pod \"certified-operators-5wfsk\" (UID: \"bad1920b-7616-41f9-b2f9-84b3073faede\") " pod="openshift-marketplace/certified-operators-5wfsk" Oct 03 00:09:16 crc kubenswrapper[4832]: E1003 00:09:16.266558 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 00:09:16.766546562 +0000 UTC m=+150.028701658 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h4hkd" (UID: "8793993f-a364-4eb7-bb10-d30f58f0e2c0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 00:09:16 crc kubenswrapper[4832]: I1003 00:09:16.266776 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bad1920b-7616-41f9-b2f9-84b3073faede-utilities\") pod \"certified-operators-5wfsk\" (UID: \"bad1920b-7616-41f9-b2f9-84b3073faede\") " pod="openshift-marketplace/certified-operators-5wfsk" Oct 03 00:09:16 crc kubenswrapper[4832]: I1003 00:09:16.294413 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2lxtj\" (UniqueName: \"kubernetes.io/projected/bad1920b-7616-41f9-b2f9-84b3073faede-kube-api-access-2lxtj\") pod \"certified-operators-5wfsk\" (UID: \"bad1920b-7616-41f9-b2f9-84b3073faede\") " pod="openshift-marketplace/certified-operators-5wfsk" Oct 03 00:09:16 crc kubenswrapper[4832]: I1003 00:09:16.367831 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 00:09:16 crc kubenswrapper[4832]: E1003 00:09:16.368477 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 00:09:16.868460078 +0000 UTC m=+150.130615184 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 00:09:16 crc kubenswrapper[4832]: I1003 00:09:16.373292 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-g59tb" event={"ID":"8e92b02b-2ba0-4d57-b018-5592e79a239e","Type":"ContainerStarted","Data":"77a59f1c5d0fed5bab0719ef72d9877fcd71c2ee02769754eb19c5e0097a3063"} Oct 03 00:09:16 crc kubenswrapper[4832]: I1003 00:09:16.414068 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"ae8c270bc411057cacbd2734da1b9eeb2477e8a8b7da5b398a7ed6ccfdfe9b64"} Oct 03 00:09:16 crc kubenswrapper[4832]: I1003 00:09:16.417886 4832 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-r6kz6" Oct 03 00:09:16 crc kubenswrapper[4832]: I1003 00:09:16.423483 4832 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5wfsk" Oct 03 00:09:16 crc kubenswrapper[4832]: I1003 00:09:16.444022 4832 generic.go:334] "Generic (PLEG): container finished" podID="69f97b67-d300-4ef4-8951-934c01b68892" containerID="1928c944b1ca0615caa4bcfd67fc4354e5b11f82514ddfc974f8838458f34d63" exitCode=0 Oct 03 00:09:16 crc kubenswrapper[4832]: I1003 00:09:16.444653 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-r6kz6" event={"ID":"69f97b67-d300-4ef4-8951-934c01b68892","Type":"ContainerDied","Data":"1928c944b1ca0615caa4bcfd67fc4354e5b11f82514ddfc974f8838458f34d63"} Oct 03 00:09:16 crc kubenswrapper[4832]: I1003 00:09:16.444688 4832 scope.go:117] "RemoveContainer" containerID="1928c944b1ca0615caa4bcfd67fc4354e5b11f82514ddfc974f8838458f34d63" Oct 03 00:09:16 crc kubenswrapper[4832]: I1003 00:09:16.469850 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h4hkd\" (UID: \"8793993f-a364-4eb7-bb10-d30f58f0e2c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-h4hkd" Oct 03 00:09:16 crc kubenswrapper[4832]: E1003 00:09:16.470207 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 00:09:16.970193948 +0000 UTC m=+150.232349054 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h4hkd" (UID: "8793993f-a364-4eb7-bb10-d30f58f0e2c0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 00:09:16 crc kubenswrapper[4832]: I1003 00:09:16.529524 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lhnfr"] Oct 03 00:09:16 crc kubenswrapper[4832]: I1003 00:09:16.554365 4832 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-10-03T00:09:15.658545802Z","Handler":null,"Name":""} Oct 03 00:09:16 crc kubenswrapper[4832]: I1003 00:09:16.558444 4832 patch_prober.go:28] interesting pod/router-default-5444994796-w8cn7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 03 00:09:16 crc kubenswrapper[4832]: [-]has-synced failed: reason withheld Oct 03 00:09:16 crc kubenswrapper[4832]: [+]process-running ok Oct 03 00:09:16 crc kubenswrapper[4832]: healthz check failed Oct 03 00:09:16 crc kubenswrapper[4832]: I1003 00:09:16.558499 4832 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-w8cn7" podUID="28399927-5270-4b93-ae66-5c08a318b376" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 03 00:09:16 crc kubenswrapper[4832]: I1003 00:09:16.570370 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"kube-api-access-pvhx9\" (UniqueName: \"kubernetes.io/projected/69f97b67-d300-4ef4-8951-934c01b68892-kube-api-access-pvhx9\") pod \"69f97b67-d300-4ef4-8951-934c01b68892\" (UID: \"69f97b67-d300-4ef4-8951-934c01b68892\") " Oct 03 00:09:16 crc kubenswrapper[4832]: I1003 00:09:16.570437 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/69f97b67-d300-4ef4-8951-934c01b68892-client-ca\") pod \"69f97b67-d300-4ef4-8951-934c01b68892\" (UID: \"69f97b67-d300-4ef4-8951-934c01b68892\") " Oct 03 00:09:16 crc kubenswrapper[4832]: I1003 00:09:16.570491 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/69f97b67-d300-4ef4-8951-934c01b68892-proxy-ca-bundles\") pod \"69f97b67-d300-4ef4-8951-934c01b68892\" (UID: \"69f97b67-d300-4ef4-8951-934c01b68892\") " Oct 03 00:09:16 crc kubenswrapper[4832]: I1003 00:09:16.570593 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 00:09:16 crc kubenswrapper[4832]: I1003 00:09:16.570649 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/69f97b67-d300-4ef4-8951-934c01b68892-serving-cert\") pod \"69f97b67-d300-4ef4-8951-934c01b68892\" (UID: \"69f97b67-d300-4ef4-8951-934c01b68892\") " Oct 03 00:09:16 crc kubenswrapper[4832]: I1003 00:09:16.570686 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69f97b67-d300-4ef4-8951-934c01b68892-config\") pod \"69f97b67-d300-4ef4-8951-934c01b68892\" (UID: \"69f97b67-d300-4ef4-8951-934c01b68892\") " Oct 03 00:09:16 crc kubenswrapper[4832]: I1003 00:09:16.571534 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69f97b67-d300-4ef4-8951-934c01b68892-client-ca" (OuterVolumeSpecName: "client-ca") pod "69f97b67-d300-4ef4-8951-934c01b68892" (UID: "69f97b67-d300-4ef4-8951-934c01b68892"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 00:09:16 crc kubenswrapper[4832]: E1003 00:09:16.571648 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 00:09:17.071619988 +0000 UTC m=+150.333775094 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 00:09:16 crc kubenswrapper[4832]: I1003 00:09:16.571702 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69f97b67-d300-4ef4-8951-934c01b68892-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "69f97b67-d300-4ef4-8951-934c01b68892" (UID: "69f97b67-d300-4ef4-8951-934c01b68892"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 00:09:16 crc kubenswrapper[4832]: I1003 00:09:16.572207 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69f97b67-d300-4ef4-8951-934c01b68892-config" (OuterVolumeSpecName: "config") pod "69f97b67-d300-4ef4-8951-934c01b68892" (UID: "69f97b67-d300-4ef4-8951-934c01b68892"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 00:09:16 crc kubenswrapper[4832]: I1003 00:09:16.572362 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h4hkd\" (UID: \"8793993f-a364-4eb7-bb10-d30f58f0e2c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-h4hkd" Oct 03 00:09:16 crc kubenswrapper[4832]: I1003 00:09:16.572539 4832 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/69f97b67-d300-4ef4-8951-934c01b68892-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 03 00:09:16 crc kubenswrapper[4832]: I1003 00:09:16.572563 4832 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69f97b67-d300-4ef4-8951-934c01b68892-config\") on node \"crc\" DevicePath \"\"" Oct 03 00:09:16 crc kubenswrapper[4832]: I1003 00:09:16.572579 4832 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/69f97b67-d300-4ef4-8951-934c01b68892-client-ca\") on node \"crc\" DevicePath \"\"" Oct 03 00:09:16 crc kubenswrapper[4832]: E1003 00:09:16.573104 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 00:09:17.073092567 +0000 UTC m=+150.335247853 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h4hkd" (UID: "8793993f-a364-4eb7-bb10-d30f58f0e2c0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 00:09:16 crc kubenswrapper[4832]: I1003 00:09:16.582126 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69f97b67-d300-4ef4-8951-934c01b68892-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "69f97b67-d300-4ef4-8951-934c01b68892" (UID: "69f97b67-d300-4ef4-8951-934c01b68892"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 00:09:16 crc kubenswrapper[4832]: I1003 00:09:16.582458 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69f97b67-d300-4ef4-8951-934c01b68892-kube-api-access-pvhx9" (OuterVolumeSpecName: "kube-api-access-pvhx9") pod "69f97b67-d300-4ef4-8951-934c01b68892" (UID: "69f97b67-d300-4ef4-8951-934c01b68892"). InnerVolumeSpecName "kube-api-access-pvhx9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 00:09:16 crc kubenswrapper[4832]: I1003 00:09:16.618450 4832 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Oct 03 00:09:16 crc kubenswrapper[4832]: I1003 00:09:16.618482 4832 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Oct 03 00:09:16 crc kubenswrapper[4832]: I1003 00:09:16.658162 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-j9lmn"] Oct 03 00:09:16 crc kubenswrapper[4832]: I1003 00:09:16.674642 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 00:09:16 crc kubenswrapper[4832]: I1003 00:09:16.675568 4832 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/69f97b67-d300-4ef4-8951-934c01b68892-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 00:09:16 crc kubenswrapper[4832]: I1003 00:09:16.675588 4832 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pvhx9\" (UniqueName: \"kubernetes.io/projected/69f97b67-d300-4ef4-8951-934c01b68892-kube-api-access-pvhx9\") on node \"crc\" DevicePath \"\"" Oct 03 00:09:16 crc kubenswrapper[4832]: I1003 00:09:16.682114 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 03 00:09:16 crc kubenswrapper[4832]: I1003 00:09:16.776864 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h4hkd\" (UID: \"8793993f-a364-4eb7-bb10-d30f58f0e2c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-h4hkd" Oct 03 00:09:16 crc kubenswrapper[4832]: I1003 00:09:16.780772 4832 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Oct 03 00:09:16 crc kubenswrapper[4832]: I1003 00:09:16.780811 4832 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h4hkd\" (UID: \"8793993f-a364-4eb7-bb10-d30f58f0e2c0\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-h4hkd" Oct 03 00:09:16 crc kubenswrapper[4832]: I1003 00:09:16.797411 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5wfsk"] Oct 03 00:09:16 crc kubenswrapper[4832]: W1003 00:09:16.807165 4832 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbad1920b_7616_41f9_b2f9_84b3073faede.slice/crio-95ab04db99e80add450c9ea4f79641c0ce26656f6c763dfa9ff5d48f2a2c1d09 WatchSource:0}: Error finding container 95ab04db99e80add450c9ea4f79641c0ce26656f6c763dfa9ff5d48f2a2c1d09: Status 404 returned error can't find the container with id 95ab04db99e80add450c9ea4f79641c0ce26656f6c763dfa9ff5d48f2a2c1d09 Oct 03 00:09:16 crc kubenswrapper[4832]: I1003 00:09:16.856112 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-dv2sr"] Oct 03 00:09:16 crc kubenswrapper[4832]: I1003 00:09:16.890101 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h4hkd\" (UID: \"8793993f-a364-4eb7-bb10-d30f58f0e2c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-h4hkd" Oct 03 00:09:16 crc kubenswrapper[4832]: I1003 00:09:16.943668 4832 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-h4hkd" Oct 03 00:09:17 crc kubenswrapper[4832]: W1003 00:09:17.010106 4832 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod13a851af_b472_4647_8093_8b0cd577b21a.slice/crio-d4304f6f0a5fdcac3fffe0eb904551091ef57cc3e61ef6f248d1400bd6e77bf0 WatchSource:0}: Error finding container d4304f6f0a5fdcac3fffe0eb904551091ef57cc3e61ef6f248d1400bd6e77bf0: Status 404 returned error can't find the container with id d4304f6f0a5fdcac3fffe0eb904551091ef57cc3e61ef6f248d1400bd6e77bf0 Oct 03 00:09:17 crc kubenswrapper[4832]: I1003 00:09:17.122511 4832 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 03 00:09:17 crc kubenswrapper[4832]: E1003 00:09:17.123115 4832 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69f97b67-d300-4ef4-8951-934c01b68892" containerName="controller-manager" Oct 03 00:09:17 crc kubenswrapper[4832]: I1003 00:09:17.123131 4832 state_mem.go:107] "Deleted CPUSet assignment" podUID="69f97b67-d300-4ef4-8951-934c01b68892" containerName="controller-manager" Oct 03 00:09:17 crc kubenswrapper[4832]: I1003 00:09:17.123220 4832 memory_manager.go:354] "RemoveStaleState removing state" podUID="69f97b67-d300-4ef4-8951-934c01b68892" containerName="controller-manager" Oct 03 00:09:17 crc kubenswrapper[4832]: I1003 00:09:17.123620 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 03 00:09:17 crc kubenswrapper[4832]: I1003 00:09:17.132624 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 03 00:09:17 crc kubenswrapper[4832]: I1003 00:09:17.132896 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Oct 03 00:09:17 crc kubenswrapper[4832]: I1003 00:09:17.133128 4832 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Oct 03 00:09:17 crc kubenswrapper[4832]: I1003 00:09:17.137789 4832 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-r6kz6 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 03 00:09:17 crc kubenswrapper[4832]: I1003 00:09:17.137832 4832 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-r6kz6" podUID="69f97b67-d300-4ef4-8951-934c01b68892" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Oct 03 00:09:17 crc kubenswrapper[4832]: I1003 00:09:17.257027 4832 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-mwvp8"] Oct 03 00:09:17 crc kubenswrapper[4832]: I1003 00:09:17.257729 4832 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-mwvp8" Oct 03 00:09:17 crc kubenswrapper[4832]: I1003 00:09:17.276014 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-mwvp8"] Oct 03 00:09:17 crc kubenswrapper[4832]: I1003 00:09:17.282818 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c6022982-53b9-4c59-af62-90f4368e2744-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"c6022982-53b9-4c59-af62-90f4368e2744\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 03 00:09:17 crc kubenswrapper[4832]: I1003 00:09:17.282938 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c6022982-53b9-4c59-af62-90f4368e2744-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"c6022982-53b9-4c59-af62-90f4368e2744\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 03 00:09:17 crc kubenswrapper[4832]: I1003 00:09:17.384108 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c6022982-53b9-4c59-af62-90f4368e2744-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"c6022982-53b9-4c59-af62-90f4368e2744\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 03 00:09:17 crc kubenswrapper[4832]: I1003 00:09:17.384178 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/42d0ed4c-fccc-4cec-bffb-60bf97896eb9-config\") pod \"controller-manager-879f6c89f-mwvp8\" (UID: \"42d0ed4c-fccc-4cec-bffb-60bf97896eb9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mwvp8" Oct 03 00:09:17 crc kubenswrapper[4832]: I1003 00:09:17.384211 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/42d0ed4c-fccc-4cec-bffb-60bf97896eb9-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-mwvp8\" (UID: \"42d0ed4c-fccc-4cec-bffb-60bf97896eb9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mwvp8" Oct 03 00:09:17 crc kubenswrapper[4832]: I1003 00:09:17.384232 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/42d0ed4c-fccc-4cec-bffb-60bf97896eb9-client-ca\") pod \"controller-manager-879f6c89f-mwvp8\" (UID: \"42d0ed4c-fccc-4cec-bffb-60bf97896eb9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mwvp8" Oct 03 00:09:17 crc kubenswrapper[4832]: I1003 00:09:17.384252 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c6022982-53b9-4c59-af62-90f4368e2744-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"c6022982-53b9-4c59-af62-90f4368e2744\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 03 00:09:17 crc kubenswrapper[4832]: I1003 00:09:17.384352 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c6022982-53b9-4c59-af62-90f4368e2744-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"c6022982-53b9-4c59-af62-90f4368e2744\") " 
pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 03 00:09:17 crc kubenswrapper[4832]: I1003 00:09:17.384379 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/42d0ed4c-fccc-4cec-bffb-60bf97896eb9-serving-cert\") pod \"controller-manager-879f6c89f-mwvp8\" (UID: \"42d0ed4c-fccc-4cec-bffb-60bf97896eb9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mwvp8" Oct 03 00:09:17 crc kubenswrapper[4832]: I1003 00:09:17.384514 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmts7\" (UniqueName: \"kubernetes.io/projected/42d0ed4c-fccc-4cec-bffb-60bf97896eb9-kube-api-access-zmts7\") pod \"controller-manager-879f6c89f-mwvp8\" (UID: \"42d0ed4c-fccc-4cec-bffb-60bf97896eb9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mwvp8" Oct 03 00:09:17 crc kubenswrapper[4832]: I1003 00:09:17.394196 4832 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Oct 03 00:09:17 crc kubenswrapper[4832]: I1003 00:09:17.428196 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c6022982-53b9-4c59-af62-90f4368e2744-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"c6022982-53b9-4c59-af62-90f4368e2744\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 03 00:09:17 crc kubenswrapper[4832]: I1003 00:09:17.450734 4832 generic.go:334] "Generic (PLEG): container finished" podID="93fca13a-c8e1-4239-b419-e4055be395cd" containerID="2db2d077d2069306a71e8a5030ac7c18796f9b014aec88318638d2e3ed6931ae" exitCode=0 Oct 03 00:09:17 crc kubenswrapper[4832]: I1003 00:09:17.450879 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j9lmn" event={"ID":"93fca13a-c8e1-4239-b419-e4055be395cd","Type":"ContainerDied","Data":"2db2d077d2069306a71e8a5030ac7c18796f9b014aec88318638d2e3ed6931ae"} Oct 03 00:09:17 crc kubenswrapper[4832]: I1003 00:09:17.451229 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j9lmn" event={"ID":"93fca13a-c8e1-4239-b419-e4055be395cd","Type":"ContainerStarted","Data":"a2b84add0e4e942439bbc953a70702f5a9418a464e6b5ff604633054e4887e74"} Oct 03 00:09:17 crc kubenswrapper[4832]: I1003 00:09:17.453021 4832 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 03 00:09:17 crc kubenswrapper[4832]: I1003 00:09:17.462197 4832 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 03 00:09:17 crc kubenswrapper[4832]: I1003 00:09:17.473607 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-g59tb" event={"ID":"8e92b02b-2ba0-4d57-b018-5592e79a239e","Type":"ContainerStarted","Data":"040ef48b70c86f14226d8e7937e9e7bbfa77d49a49ca6a93676b2f7b25aeb3f3"} Oct 03 00:09:17 crc kubenswrapper[4832]: I1003 00:09:17.483994 4832 generic.go:334] "Generic (PLEG): container finished" podID="13a851af-b472-4647-8093-8b0cd577b21a" containerID="73ebff37ae24d4ae4ad91cfe5e6f84ee746fd7e5dcb4d80a65be7e3a960e7f58" exitCode=0 Oct 03 00:09:17 crc kubenswrapper[4832]: I1003 00:09:17.484105 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dv2sr" event={"ID":"13a851af-b472-4647-8093-8b0cd577b21a","Type":"ContainerDied","Data":"73ebff37ae24d4ae4ad91cfe5e6f84ee746fd7e5dcb4d80a65be7e3a960e7f58"} Oct 03 00:09:17 crc kubenswrapper[4832]: I1003 00:09:17.484141 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dv2sr" event={"ID":"13a851af-b472-4647-8093-8b0cd577b21a","Type":"ContainerStarted","Data":"d4304f6f0a5fdcac3fffe0eb904551091ef57cc3e61ef6f248d1400bd6e77bf0"} Oct 03 00:09:17 crc kubenswrapper[4832]: I1003 00:09:17.485227 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/42d0ed4c-fccc-4cec-bffb-60bf97896eb9-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-mwvp8\" (UID: \"42d0ed4c-fccc-4cec-bffb-60bf97896eb9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mwvp8" Oct 03 00:09:17 crc kubenswrapper[4832]: I1003 00:09:17.485260 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/42d0ed4c-fccc-4cec-bffb-60bf97896eb9-client-ca\") pod \"controller-manager-879f6c89f-mwvp8\" (UID: \"42d0ed4c-fccc-4cec-bffb-60bf97896eb9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mwvp8" Oct 03 00:09:17 crc kubenswrapper[4832]: I1003 00:09:17.485292 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/42d0ed4c-fccc-4cec-bffb-60bf97896eb9-serving-cert\") pod \"controller-manager-879f6c89f-mwvp8\" (UID: \"42d0ed4c-fccc-4cec-bffb-60bf97896eb9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mwvp8" Oct 03 00:09:17 crc kubenswrapper[4832]: I1003 00:09:17.485315 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zmts7\" (UniqueName: \"kubernetes.io/projected/42d0ed4c-fccc-4cec-bffb-60bf97896eb9-kube-api-access-zmts7\") pod \"controller-manager-879f6c89f-mwvp8\" (UID: \"42d0ed4c-fccc-4cec-bffb-60bf97896eb9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mwvp8" Oct 03 00:09:17 crc kubenswrapper[4832]: I1003 00:09:17.485363 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/42d0ed4c-fccc-4cec-bffb-60bf97896eb9-config\") pod \"controller-manager-879f6c89f-mwvp8\" (UID: \"42d0ed4c-fccc-4cec-bffb-60bf97896eb9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mwvp8" Oct 03 00:09:17 crc kubenswrapper[4832]: I1003 00:09:17.486580 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/42d0ed4c-fccc-4cec-bffb-60bf97896eb9-config\") pod \"controller-manager-879f6c89f-mwvp8\" (UID: \"42d0ed4c-fccc-4cec-bffb-60bf97896eb9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mwvp8" Oct 03 00:09:17 crc kubenswrapper[4832]: I1003 00:09:17.486627 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/42d0ed4c-fccc-4cec-bffb-60bf97896eb9-client-ca\") pod \"controller-manager-879f6c89f-mwvp8\" (UID: \"42d0ed4c-fccc-4cec-bffb-60bf97896eb9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mwvp8" Oct 03 00:09:17 crc kubenswrapper[4832]: I1003 00:09:17.487344 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/42d0ed4c-fccc-4cec-bffb-60bf97896eb9-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-mwvp8\" (UID: \"42d0ed4c-fccc-4cec-bffb-60bf97896eb9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mwvp8" Oct 03 00:09:17 crc kubenswrapper[4832]: I1003 00:09:17.488931 4832 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-28hml"] Oct 03 00:09:17 crc kubenswrapper[4832]: I1003 00:09:17.489853 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-28hml" Oct 03 00:09:17 crc kubenswrapper[4832]: I1003 00:09:17.501661 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/42d0ed4c-fccc-4cec-bffb-60bf97896eb9-serving-cert\") pod \"controller-manager-879f6c89f-mwvp8\" (UID: \"42d0ed4c-fccc-4cec-bffb-60bf97896eb9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mwvp8" Oct 03 00:09:17 crc kubenswrapper[4832]: I1003 00:09:17.515433 4832 generic.go:334] "Generic (PLEG): container finished" podID="bad1920b-7616-41f9-b2f9-84b3073faede" containerID="bd9aa6c4c9a23d79eb81db89fe515fb9a12b0a428af76a4577d3ea0cd9f1e8e6" exitCode=0 Oct 03 00:09:17 crc kubenswrapper[4832]: I1003 00:09:17.518346 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5wfsk" event={"ID":"bad1920b-7616-41f9-b2f9-84b3073faede","Type":"ContainerDied","Data":"bd9aa6c4c9a23d79eb81db89fe515fb9a12b0a428af76a4577d3ea0cd9f1e8e6"} Oct 03 00:09:17 crc kubenswrapper[4832]: I1003 00:09:17.521250 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5wfsk" event={"ID":"bad1920b-7616-41f9-b2f9-84b3073faede","Type":"ContainerStarted","Data":"95ab04db99e80add450c9ea4f79641c0ce26656f6c763dfa9ff5d48f2a2c1d09"} Oct 03 00:09:17 crc kubenswrapper[4832]: I1003 00:09:17.535322 4832 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Oct 03 00:09:17 crc kubenswrapper[4832]: I1003 00:09:17.540811 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zmts7\" (UniqueName: \"kubernetes.io/projected/42d0ed4c-fccc-4cec-bffb-60bf97896eb9-kube-api-access-zmts7\") pod \"controller-manager-879f6c89f-mwvp8\" (UID: \"42d0ed4c-fccc-4cec-bffb-60bf97896eb9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mwvp8" Oct 03 00:09:17 crc kubenswrapper[4832]: I1003 00:09:17.550315 4832 generic.go:334] "Generic (PLEG): container finished" podID="e78dfd8f-d109-412c-afcc-2e9fe3786bd3" 
containerID="f311bc33f18744ff242f4bca6d5fd94066c9bd5f88f7c7611f1ca788ab0e80a7" exitCode=0 Oct 03 00:09:17 crc kubenswrapper[4832]: I1003 00:09:17.550424 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324160-nmbdb" event={"ID":"e78dfd8f-d109-412c-afcc-2e9fe3786bd3","Type":"ContainerDied","Data":"f311bc33f18744ff242f4bca6d5fd94066c9bd5f88f7c7611f1ca788ab0e80a7"} Oct 03 00:09:17 crc kubenswrapper[4832]: I1003 00:09:17.553452 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-28hml"] Oct 03 00:09:17 crc kubenswrapper[4832]: I1003 00:09:17.558912 4832 patch_prober.go:28] interesting pod/router-default-5444994796-w8cn7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 03 00:09:17 crc kubenswrapper[4832]: [-]has-synced failed: reason withheld Oct 03 00:09:17 crc kubenswrapper[4832]: [+]process-running ok Oct 03 00:09:17 crc kubenswrapper[4832]: healthz check failed Oct 03 00:09:17 crc kubenswrapper[4832]: I1003 00:09:17.558975 4832 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-w8cn7" podUID="28399927-5270-4b93-ae66-5c08a318b376" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 03 00:09:17 crc kubenswrapper[4832]: I1003 00:09:17.560566 4832 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-r6kz6" Oct 03 00:09:17 crc kubenswrapper[4832]: I1003 00:09:17.560723 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-r6kz6" event={"ID":"69f97b67-d300-4ef4-8951-934c01b68892","Type":"ContainerDied","Data":"9ba7a15efaa47c8f412ab0847791326a696d4804ea562b9bc17428a3be6238fe"} Oct 03 00:09:17 crc kubenswrapper[4832]: I1003 00:09:17.571285 4832 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-mwvp8" Oct 03 00:09:17 crc kubenswrapper[4832]: I1003 00:09:17.592231 4832 generic.go:334] "Generic (PLEG): container finished" podID="bbb2ac77-f639-41e8-897d-7744dd4e987d" containerID="47b5929020b3c6cd6c804a876b927119cd721463907674eb95c730b8394edae4" exitCode=0 Oct 03 00:09:17 crc kubenswrapper[4832]: I1003 00:09:17.594116 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/99b27c40-f8ce-41a0-befc-18366ba6770b-catalog-content\") pod \"redhat-marketplace-28hml\" (UID: \"99b27c40-f8ce-41a0-befc-18366ba6770b\") " pod="openshift-marketplace/redhat-marketplace-28hml" Oct 03 00:09:17 crc kubenswrapper[4832]: I1003 00:09:17.594159 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vfzpx\" (UniqueName: \"kubernetes.io/projected/99b27c40-f8ce-41a0-befc-18366ba6770b-kube-api-access-vfzpx\") pod \"redhat-marketplace-28hml\" (UID: \"99b27c40-f8ce-41a0-befc-18366ba6770b\") " pod="openshift-marketplace/redhat-marketplace-28hml" Oct 03 00:09:17 crc kubenswrapper[4832]: I1003 00:09:17.594203 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/99b27c40-f8ce-41a0-befc-18366ba6770b-utilities\") pod \"redhat-marketplace-28hml\" (UID: \"99b27c40-f8ce-41a0-befc-18366ba6770b\") " pod="openshift-marketplace/redhat-marketplace-28hml" Oct 03 00:09:17 crc kubenswrapper[4832]: I1003 00:09:17.595248 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lhnfr" event={"ID":"bbb2ac77-f639-41e8-897d-7744dd4e987d","Type":"ContainerDied","Data":"47b5929020b3c6cd6c804a876b927119cd721463907674eb95c730b8394edae4"} Oct 03 00:09:17 crc kubenswrapper[4832]: I1003 00:09:17.595303 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lhnfr" event={"ID":"bbb2ac77-f639-41e8-897d-7744dd4e987d","Type":"ContainerStarted","Data":"a160fc41a210d7d74d7ca391fdc9fbca2dd10509759cfcb83ca6e55ab68242a0"} Oct 03 00:09:17 crc kubenswrapper[4832]: I1003 00:09:17.595322 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-h4hkd"] Oct 03 00:09:17 crc kubenswrapper[4832]: I1003 00:09:17.660387 4832 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-r6kz6"] Oct 03 00:09:17 crc kubenswrapper[4832]: I1003 00:09:17.669984 4832 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-r6kz6"] Oct 03 00:09:17 crc kubenswrapper[4832]: I1003 00:09:17.702237 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/99b27c40-f8ce-41a0-befc-18366ba6770b-catalog-content\") pod \"redhat-marketplace-28hml\" (UID: \"99b27c40-f8ce-41a0-befc-18366ba6770b\") " pod="openshift-marketplace/redhat-marketplace-28hml" Oct 03 00:09:17 crc kubenswrapper[4832]: I1003 00:09:17.702616 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vfzpx\" (UniqueName: \"kubernetes.io/projected/99b27c40-f8ce-41a0-befc-18366ba6770b-kube-api-access-vfzpx\") pod \"redhat-marketplace-28hml\" (UID: \"99b27c40-f8ce-41a0-befc-18366ba6770b\") " 
pod="openshift-marketplace/redhat-marketplace-28hml" Oct 03 00:09:17 crc kubenswrapper[4832]: I1003 00:09:17.702651 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/99b27c40-f8ce-41a0-befc-18366ba6770b-utilities\") pod \"redhat-marketplace-28hml\" (UID: \"99b27c40-f8ce-41a0-befc-18366ba6770b\") " pod="openshift-marketplace/redhat-marketplace-28hml" Oct 03 00:09:17 crc kubenswrapper[4832]: I1003 00:09:17.703717 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/99b27c40-f8ce-41a0-befc-18366ba6770b-catalog-content\") pod \"redhat-marketplace-28hml\" (UID: \"99b27c40-f8ce-41a0-befc-18366ba6770b\") " pod="openshift-marketplace/redhat-marketplace-28hml" Oct 03 00:09:17 crc kubenswrapper[4832]: I1003 00:09:17.703981 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/99b27c40-f8ce-41a0-befc-18366ba6770b-utilities\") pod \"redhat-marketplace-28hml\" (UID: \"99b27c40-f8ce-41a0-befc-18366ba6770b\") " pod="openshift-marketplace/redhat-marketplace-28hml" Oct 03 00:09:17 crc kubenswrapper[4832]: I1003 00:09:17.745319 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vfzpx\" (UniqueName: \"kubernetes.io/projected/99b27c40-f8ce-41a0-befc-18366ba6770b-kube-api-access-vfzpx\") pod \"redhat-marketplace-28hml\" (UID: \"99b27c40-f8ce-41a0-befc-18366ba6770b\") " pod="openshift-marketplace/redhat-marketplace-28hml" Oct 03 00:09:17 crc kubenswrapper[4832]: I1003 00:09:17.798931 4832 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-g59tb" podStartSLOduration=12.798907868 podStartE2EDuration="12.798907868s" podCreationTimestamp="2025-10-03 00:09:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 00:09:17.798409031 +0000 UTC m=+151.060564157" watchObservedRunningTime="2025-10-03 00:09:17.798907868 +0000 UTC m=+151.061062974" Oct 03 00:09:17 crc kubenswrapper[4832]: I1003 00:09:17.838670 4832 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-hxv6p" Oct 03 00:09:17 crc kubenswrapper[4832]: I1003 00:09:17.839663 4832 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-hxv6p" Oct 03 00:09:17 crc kubenswrapper[4832]: I1003 00:09:17.840901 4832 patch_prober.go:28] interesting pod/console-f9d7485db-hxv6p container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.16:8443/health\": dial tcp 10.217.0.16:8443: connect: connection refused" start-of-body= Oct 03 00:09:17 crc kubenswrapper[4832]: I1003 00:09:17.840947 4832 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-hxv6p" podUID="a3407ec5-4299-4c9d-9b37-64448de352b7" containerName="console" probeResult="failure" output="Get \"https://10.217.0.16:8443/health\": dial tcp 10.217.0.16:8443: connect: connection refused" Oct 03 00:09:17 crc kubenswrapper[4832]: I1003 00:09:17.865736 4832 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-28hml" Oct 03 00:09:17 crc kubenswrapper[4832]: I1003 00:09:17.875601 4832 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-4596m"] Oct 03 00:09:17 crc kubenswrapper[4832]: I1003 00:09:17.877498 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4596m" Oct 03 00:09:17 crc kubenswrapper[4832]: I1003 00:09:17.899087 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4596m"] Oct 03 00:09:17 crc kubenswrapper[4832]: I1003 00:09:17.943776 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 03 00:09:18 crc kubenswrapper[4832]: I1003 00:09:18.002577 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-mwvp8"] Oct 03 00:09:18 crc kubenswrapper[4832]: I1003 00:09:18.006415 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-297jq\" (UniqueName: \"kubernetes.io/projected/549600c0-8e76-4369-bcf8-8d0176082edc-kube-api-access-297jq\") pod \"redhat-marketplace-4596m\" (UID: \"549600c0-8e76-4369-bcf8-8d0176082edc\") " pod="openshift-marketplace/redhat-marketplace-4596m" Oct 03 00:09:18 crc kubenswrapper[4832]: I1003 00:09:18.006483 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/549600c0-8e76-4369-bcf8-8d0176082edc-catalog-content\") pod \"redhat-marketplace-4596m\" (UID: \"549600c0-8e76-4369-bcf8-8d0176082edc\") " pod="openshift-marketplace/redhat-marketplace-4596m" Oct 03 00:09:18 crc kubenswrapper[4832]: I1003 00:09:18.006505 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/549600c0-8e76-4369-bcf8-8d0176082edc-utilities\") pod \"redhat-marketplace-4596m\" (UID: \"549600c0-8e76-4369-bcf8-8d0176082edc\") " pod="openshift-marketplace/redhat-marketplace-4596m" Oct 03 00:09:18 crc kubenswrapper[4832]: I1003 00:09:18.013389 4832 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-zzj42" Oct 03 00:09:18 crc kubenswrapper[4832]: W1003 00:09:18.091590 4832 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod42d0ed4c_fccc_4cec_bffb_60bf97896eb9.slice/crio-b0961b70467d802766ab122d53e01d0ed191326def62f02f0d439c65d35740f4 WatchSource:0}: Error finding container b0961b70467d802766ab122d53e01d0ed191326def62f02f0d439c65d35740f4: Status 404 returned error can't find the container with id b0961b70467d802766ab122d53e01d0ed191326def62f02f0d439c65d35740f4 Oct 03 00:09:18 crc kubenswrapper[4832]: I1003 00:09:18.099399 4832 patch_prober.go:28] interesting pod/downloads-7954f5f757-znmk2 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Oct 03 00:09:18 crc kubenswrapper[4832]: I1003 00:09:18.099451 4832 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-znmk2" podUID="c73e2dc6-6174-4a76-a7d2-192ef93b7935" containerName="download-server" 
probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Oct 03 00:09:18 crc kubenswrapper[4832]: I1003 00:09:18.099829 4832 patch_prober.go:28] interesting pod/downloads-7954f5f757-znmk2 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Oct 03 00:09:18 crc kubenswrapper[4832]: I1003 00:09:18.099849 4832 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-znmk2" podUID="c73e2dc6-6174-4a76-a7d2-192ef93b7935" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Oct 03 00:09:18 crc kubenswrapper[4832]: I1003 00:09:18.107995 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-297jq\" (UniqueName: \"kubernetes.io/projected/549600c0-8e76-4369-bcf8-8d0176082edc-kube-api-access-297jq\") pod \"redhat-marketplace-4596m\" (UID: \"549600c0-8e76-4369-bcf8-8d0176082edc\") " pod="openshift-marketplace/redhat-marketplace-4596m" Oct 03 00:09:18 crc kubenswrapper[4832]: I1003 00:09:18.108036 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/549600c0-8e76-4369-bcf8-8d0176082edc-catalog-content\") pod \"redhat-marketplace-4596m\" (UID: \"549600c0-8e76-4369-bcf8-8d0176082edc\") " pod="openshift-marketplace/redhat-marketplace-4596m" Oct 03 00:09:18 crc kubenswrapper[4832]: I1003 00:09:18.108053 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/549600c0-8e76-4369-bcf8-8d0176082edc-utilities\") pod \"redhat-marketplace-4596m\" (UID: \"549600c0-8e76-4369-bcf8-8d0176082edc\") " pod="openshift-marketplace/redhat-marketplace-4596m" Oct 03 00:09:18 crc kubenswrapper[4832]: I1003 00:09:18.109898 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/549600c0-8e76-4369-bcf8-8d0176082edc-utilities\") pod \"redhat-marketplace-4596m\" (UID: \"549600c0-8e76-4369-bcf8-8d0176082edc\") " pod="openshift-marketplace/redhat-marketplace-4596m" Oct 03 00:09:18 crc kubenswrapper[4832]: I1003 00:09:18.110141 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/549600c0-8e76-4369-bcf8-8d0176082edc-catalog-content\") pod \"redhat-marketplace-4596m\" (UID: \"549600c0-8e76-4369-bcf8-8d0176082edc\") " pod="openshift-marketplace/redhat-marketplace-4596m" Oct 03 00:09:18 crc kubenswrapper[4832]: I1003 00:09:18.169938 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-297jq\" (UniqueName: \"kubernetes.io/projected/549600c0-8e76-4369-bcf8-8d0176082edc-kube-api-access-297jq\") pod \"redhat-marketplace-4596m\" (UID: \"549600c0-8e76-4369-bcf8-8d0176082edc\") " pod="openshift-marketplace/redhat-marketplace-4596m" Oct 03 00:09:18 crc kubenswrapper[4832]: I1003 00:09:18.217133 4832 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4596m" Oct 03 00:09:18 crc kubenswrapper[4832]: I1003 00:09:18.431365 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-28hml"] Oct 03 00:09:18 crc kubenswrapper[4832]: I1003 00:09:18.555351 4832 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-w8cn7" Oct 03 00:09:18 crc kubenswrapper[4832]: I1003 00:09:18.565524 4832 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-w8cn7" Oct 03 00:09:18 crc kubenswrapper[4832]: I1003 00:09:18.578713 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4596m"] Oct 03 00:09:18 crc kubenswrapper[4832]: I1003 00:09:18.617331 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-28hml" event={"ID":"99b27c40-f8ce-41a0-befc-18366ba6770b","Type":"ContainerStarted","Data":"f4ba2008b67e2cf4884380503313b95611c76fb8c023e1001173efbb3021d777"} Oct 03 00:09:18 crc kubenswrapper[4832]: I1003 00:09:18.624980 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"c6022982-53b9-4c59-af62-90f4368e2744","Type":"ContainerStarted","Data":"ccbe0cf01977a41c99998c104bdbc78f97c353a335f3135306973fea63888b38"} Oct 03 00:09:18 crc kubenswrapper[4832]: I1003 00:09:18.625570 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"c6022982-53b9-4c59-af62-90f4368e2744","Type":"ContainerStarted","Data":"5760af10582d55be8dd490a1f2fb93ae9a94050521955720c870cd1144d8d848"} Oct 03 00:09:18 crc kubenswrapper[4832]: I1003 00:09:18.631717 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-h4hkd" event={"ID":"8793993f-a364-4eb7-bb10-d30f58f0e2c0","Type":"ContainerStarted","Data":"4ca62c8692c8cc26e90fbdcc0cbe22ff5fac84414196c7ef81506108333cd86b"} Oct 03 00:09:18 crc kubenswrapper[4832]: I1003 00:09:18.631781 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-h4hkd" event={"ID":"8793993f-a364-4eb7-bb10-d30f58f0e2c0","Type":"ContainerStarted","Data":"140f7e1baf89868a37933c11aa634084ea0b0c952f355d85990e729aa6bdb77e"} Oct 03 00:09:18 crc kubenswrapper[4832]: I1003 00:09:18.632637 4832 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-h4hkd" Oct 03 00:09:18 crc kubenswrapper[4832]: I1003 00:09:18.647211 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-mwvp8" event={"ID":"42d0ed4c-fccc-4cec-bffb-60bf97896eb9","Type":"ContainerStarted","Data":"c463fb4ee70659b32d53c6bf73950119c1675dc95ab32f2547252918ee9db46c"} Oct 03 00:09:18 crc kubenswrapper[4832]: I1003 00:09:18.649804 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-mwvp8" event={"ID":"42d0ed4c-fccc-4cec-bffb-60bf97896eb9","Type":"ContainerStarted","Data":"b0961b70467d802766ab122d53e01d0ed191326def62f02f0d439c65d35740f4"} Oct 03 00:09:18 crc kubenswrapper[4832]: I1003 00:09:18.649904 4832 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-mwvp8" Oct 03 00:09:18 crc 
kubenswrapper[4832]: I1003 00:09:18.652067 4832 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=1.652055191 podStartE2EDuration="1.652055191s" podCreationTimestamp="2025-10-03 00:09:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 00:09:18.650223649 +0000 UTC m=+151.912378755" watchObservedRunningTime="2025-10-03 00:09:18.652055191 +0000 UTC m=+151.914210307" Oct 03 00:09:18 crc kubenswrapper[4832]: I1003 00:09:18.653762 4832 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-w8cn7" Oct 03 00:09:18 crc kubenswrapper[4832]: I1003 00:09:18.666389 4832 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-mwvp8" Oct 03 00:09:18 crc kubenswrapper[4832]: I1003 00:09:18.683378 4832 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-46d2w"] Oct 03 00:09:18 crc kubenswrapper[4832]: I1003 00:09:18.684846 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-46d2w" Oct 03 00:09:18 crc kubenswrapper[4832]: I1003 00:09:18.688467 4832 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Oct 03 00:09:18 crc kubenswrapper[4832]: I1003 00:09:18.741140 4832 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-h4hkd" podStartSLOduration=127.741116325 podStartE2EDuration="2m7.741116325s" podCreationTimestamp="2025-10-03 00:07:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 00:09:18.685988381 +0000 UTC m=+151.948143497" watchObservedRunningTime="2025-10-03 00:09:18.741116325 +0000 UTC m=+152.003271431" Oct 03 00:09:18 crc kubenswrapper[4832]: I1003 00:09:18.745872 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-46d2w"] Oct 03 00:09:18 crc kubenswrapper[4832]: I1003 00:09:18.749400 4832 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-mwvp8" podStartSLOduration=3.749372633 podStartE2EDuration="3.749372633s" podCreationTimestamp="2025-10-03 00:09:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 00:09:18.746310069 +0000 UTC m=+152.008465165" watchObservedRunningTime="2025-10-03 00:09:18.749372633 +0000 UTC m=+152.011527739" Oct 03 00:09:18 crc kubenswrapper[4832]: I1003 00:09:18.790151 4832 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" Oct 03 00:09:18 crc kubenswrapper[4832]: I1003 00:09:18.843121 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/137f996e-6dba-4fce-b6f4-650e9c96f006-catalog-content\") pod \"redhat-operators-46d2w\" (UID: \"137f996e-6dba-4fce-b6f4-650e9c96f006\") " pod="openshift-marketplace/redhat-operators-46d2w" Oct 03 00:09:18 crc kubenswrapper[4832]: I1003 00:09:18.843241 4832 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kkffr\" (UniqueName: \"kubernetes.io/projected/137f996e-6dba-4fce-b6f4-650e9c96f006-kube-api-access-kkffr\") pod \"redhat-operators-46d2w\" (UID: \"137f996e-6dba-4fce-b6f4-650e9c96f006\") " pod="openshift-marketplace/redhat-operators-46d2w" Oct 03 00:09:18 crc kubenswrapper[4832]: I1003 00:09:18.843345 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/137f996e-6dba-4fce-b6f4-650e9c96f006-utilities\") pod \"redhat-operators-46d2w\" (UID: \"137f996e-6dba-4fce-b6f4-650e9c96f006\") " pod="openshift-marketplace/redhat-operators-46d2w" Oct 03 00:09:18 crc kubenswrapper[4832]: I1003 00:09:18.962967 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/137f996e-6dba-4fce-b6f4-650e9c96f006-catalog-content\") pod \"redhat-operators-46d2w\" (UID: \"137f996e-6dba-4fce-b6f4-650e9c96f006\") " pod="openshift-marketplace/redhat-operators-46d2w" Oct 03 00:09:18 crc kubenswrapper[4832]: I1003 00:09:18.963462 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kkffr\" (UniqueName: \"kubernetes.io/projected/137f996e-6dba-4fce-b6f4-650e9c96f006-kube-api-access-kkffr\") pod \"redhat-operators-46d2w\" (UID: \"137f996e-6dba-4fce-b6f4-650e9c96f006\") " pod="openshift-marketplace/redhat-operators-46d2w" Oct 03 00:09:18 crc kubenswrapper[4832]: I1003 00:09:18.963518 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/137f996e-6dba-4fce-b6f4-650e9c96f006-utilities\") pod \"redhat-operators-46d2w\" (UID: \"137f996e-6dba-4fce-b6f4-650e9c96f006\") " pod="openshift-marketplace/redhat-operators-46d2w" Oct 03 00:09:18 crc kubenswrapper[4832]: I1003 00:09:18.964546 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/137f996e-6dba-4fce-b6f4-650e9c96f006-catalog-content\") pod \"redhat-operators-46d2w\" (UID: \"137f996e-6dba-4fce-b6f4-650e9c96f006\") " pod="openshift-marketplace/redhat-operators-46d2w" Oct 03 00:09:18 crc kubenswrapper[4832]: I1003 00:09:18.964649 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/137f996e-6dba-4fce-b6f4-650e9c96f006-utilities\") pod \"redhat-operators-46d2w\" (UID: \"137f996e-6dba-4fce-b6f4-650e9c96f006\") " pod="openshift-marketplace/redhat-operators-46d2w" Oct 03 00:09:18 crc kubenswrapper[4832]: I1003 00:09:18.995447 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kkffr\" (UniqueName: \"kubernetes.io/projected/137f996e-6dba-4fce-b6f4-650e9c96f006-kube-api-access-kkffr\") pod \"redhat-operators-46d2w\" (UID: \"137f996e-6dba-4fce-b6f4-650e9c96f006\") " pod="openshift-marketplace/redhat-operators-46d2w" Oct 03 00:09:19 crc kubenswrapper[4832]: I1003 00:09:19.030080 4832 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-wbjmp" Oct 03 00:09:19 crc kubenswrapper[4832]: I1003 00:09:19.074358 4832 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-lc7px"] Oct 03 00:09:19 crc kubenswrapper[4832]: I1003 00:09:19.085726 4832 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-lc7px" Oct 03 00:09:19 crc kubenswrapper[4832]: I1003 00:09:19.100188 4832 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jfkwz" Oct 03 00:09:19 crc kubenswrapper[4832]: I1003 00:09:19.100227 4832 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jfkwz" Oct 03 00:09:19 crc kubenswrapper[4832]: I1003 00:09:19.110685 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lc7px"] Oct 03 00:09:19 crc kubenswrapper[4832]: I1003 00:09:19.112934 4832 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jfkwz" Oct 03 00:09:19 crc kubenswrapper[4832]: I1003 00:09:19.141813 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-46d2w" Oct 03 00:09:19 crc kubenswrapper[4832]: I1003 00:09:19.234038 4832 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324160-nmbdb" Oct 03 00:09:19 crc kubenswrapper[4832]: I1003 00:09:19.270018 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqs62\" (UniqueName: \"kubernetes.io/projected/7af8592d-ccb5-47d1-9f02-ffac59f8b467-kube-api-access-qqs62\") pod \"redhat-operators-lc7px\" (UID: \"7af8592d-ccb5-47d1-9f02-ffac59f8b467\") " pod="openshift-marketplace/redhat-operators-lc7px" Oct 03 00:09:19 crc kubenswrapper[4832]: I1003 00:09:19.270373 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7af8592d-ccb5-47d1-9f02-ffac59f8b467-utilities\") pod \"redhat-operators-lc7px\" (UID: \"7af8592d-ccb5-47d1-9f02-ffac59f8b467\") " pod="openshift-marketplace/redhat-operators-lc7px" Oct 03 00:09:19 crc kubenswrapper[4832]: I1003 00:09:19.270455 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7af8592d-ccb5-47d1-9f02-ffac59f8b467-catalog-content\") pod \"redhat-operators-lc7px\" (UID: \"7af8592d-ccb5-47d1-9f02-ffac59f8b467\") " pod="openshift-marketplace/redhat-operators-lc7px" Oct 03 00:09:19 crc kubenswrapper[4832]: I1003 00:09:19.371390 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e78dfd8f-d109-412c-afcc-2e9fe3786bd3-config-volume\") pod \"e78dfd8f-d109-412c-afcc-2e9fe3786bd3\" (UID: \"e78dfd8f-d109-412c-afcc-2e9fe3786bd3\") " Oct 03 00:09:19 crc kubenswrapper[4832]: I1003 00:09:19.371461 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jd7nk\" (UniqueName: \"kubernetes.io/projected/e78dfd8f-d109-412c-afcc-2e9fe3786bd3-kube-api-access-jd7nk\") pod \"e78dfd8f-d109-412c-afcc-2e9fe3786bd3\" (UID: \"e78dfd8f-d109-412c-afcc-2e9fe3786bd3\") " Oct 03 00:09:19 crc kubenswrapper[4832]: I1003 00:09:19.371559 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e78dfd8f-d109-412c-afcc-2e9fe3786bd3-secret-volume\") pod \"e78dfd8f-d109-412c-afcc-2e9fe3786bd3\" (UID: \"e78dfd8f-d109-412c-afcc-2e9fe3786bd3\") " Oct 03 
00:09:19 crc kubenswrapper[4832]: I1003 00:09:19.373214 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7af8592d-ccb5-47d1-9f02-ffac59f8b467-catalog-content\") pod \"redhat-operators-lc7px\" (UID: \"7af8592d-ccb5-47d1-9f02-ffac59f8b467\") " pod="openshift-marketplace/redhat-operators-lc7px" Oct 03 00:09:19 crc kubenswrapper[4832]: I1003 00:09:19.373957 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e78dfd8f-d109-412c-afcc-2e9fe3786bd3-config-volume" (OuterVolumeSpecName: "config-volume") pod "e78dfd8f-d109-412c-afcc-2e9fe3786bd3" (UID: "e78dfd8f-d109-412c-afcc-2e9fe3786bd3"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 00:09:19 crc kubenswrapper[4832]: I1003 00:09:19.379539 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7af8592d-ccb5-47d1-9f02-ffac59f8b467-catalog-content\") pod \"redhat-operators-lc7px\" (UID: \"7af8592d-ccb5-47d1-9f02-ffac59f8b467\") " pod="openshift-marketplace/redhat-operators-lc7px" Oct 03 00:09:19 crc kubenswrapper[4832]: I1003 00:09:19.379782 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qqs62\" (UniqueName: \"kubernetes.io/projected/7af8592d-ccb5-47d1-9f02-ffac59f8b467-kube-api-access-qqs62\") pod \"redhat-operators-lc7px\" (UID: \"7af8592d-ccb5-47d1-9f02-ffac59f8b467\") " pod="openshift-marketplace/redhat-operators-lc7px" Oct 03 00:09:19 crc kubenswrapper[4832]: I1003 00:09:19.379908 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7af8592d-ccb5-47d1-9f02-ffac59f8b467-utilities\") pod \"redhat-operators-lc7px\" (UID: \"7af8592d-ccb5-47d1-9f02-ffac59f8b467\") " pod="openshift-marketplace/redhat-operators-lc7px" Oct 03 00:09:19 crc kubenswrapper[4832]: I1003 00:09:19.380030 4832 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e78dfd8f-d109-412c-afcc-2e9fe3786bd3-config-volume\") on node \"crc\" DevicePath \"\"" Oct 03 00:09:19 crc kubenswrapper[4832]: I1003 00:09:19.380401 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7af8592d-ccb5-47d1-9f02-ffac59f8b467-utilities\") pod \"redhat-operators-lc7px\" (UID: \"7af8592d-ccb5-47d1-9f02-ffac59f8b467\") " pod="openshift-marketplace/redhat-operators-lc7px" Oct 03 00:09:19 crc kubenswrapper[4832]: I1003 00:09:19.382178 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e78dfd8f-d109-412c-afcc-2e9fe3786bd3-kube-api-access-jd7nk" (OuterVolumeSpecName: "kube-api-access-jd7nk") pod "e78dfd8f-d109-412c-afcc-2e9fe3786bd3" (UID: "e78dfd8f-d109-412c-afcc-2e9fe3786bd3"). InnerVolumeSpecName "kube-api-access-jd7nk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 00:09:19 crc kubenswrapper[4832]: I1003 00:09:19.390573 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e78dfd8f-d109-412c-afcc-2e9fe3786bd3-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "e78dfd8f-d109-412c-afcc-2e9fe3786bd3" (UID: "e78dfd8f-d109-412c-afcc-2e9fe3786bd3"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 00:09:19 crc kubenswrapper[4832]: I1003 00:09:19.399087 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqs62\" (UniqueName: \"kubernetes.io/projected/7af8592d-ccb5-47d1-9f02-ffac59f8b467-kube-api-access-qqs62\") pod \"redhat-operators-lc7px\" (UID: \"7af8592d-ccb5-47d1-9f02-ffac59f8b467\") " pod="openshift-marketplace/redhat-operators-lc7px" Oct 03 00:09:19 crc kubenswrapper[4832]: I1003 00:09:19.403090 4832 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69f97b67-d300-4ef4-8951-934c01b68892" path="/var/lib/kubelet/pods/69f97b67-d300-4ef4-8951-934c01b68892/volumes" Oct 03 00:09:19 crc kubenswrapper[4832]: I1003 00:09:19.444127 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lc7px" Oct 03 00:09:19 crc kubenswrapper[4832]: I1003 00:09:19.481310 4832 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jd7nk\" (UniqueName: \"kubernetes.io/projected/e78dfd8f-d109-412c-afcc-2e9fe3786bd3-kube-api-access-jd7nk\") on node \"crc\" DevicePath \"\"" Oct 03 00:09:19 crc kubenswrapper[4832]: I1003 00:09:19.481347 4832 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e78dfd8f-d109-412c-afcc-2e9fe3786bd3-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 03 00:09:19 crc kubenswrapper[4832]: I1003 00:09:19.664988 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-46d2w"] Oct 03 00:09:19 crc kubenswrapper[4832]: I1003 00:09:19.685494 4832 generic.go:334] "Generic (PLEG): container finished" podID="99b27c40-f8ce-41a0-befc-18366ba6770b" containerID="18c8dd743b8f3e796a694e45bc47f03a1f6f5d10bd51898422bf931113c59fde" exitCode=0 Oct 03 00:09:19 crc kubenswrapper[4832]: I1003 00:09:19.685560 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-28hml" event={"ID":"99b27c40-f8ce-41a0-befc-18366ba6770b","Type":"ContainerDied","Data":"18c8dd743b8f3e796a694e45bc47f03a1f6f5d10bd51898422bf931113c59fde"} Oct 03 00:09:19 crc kubenswrapper[4832]: I1003 00:09:19.702231 4832 generic.go:334] "Generic (PLEG): container finished" podID="549600c0-8e76-4369-bcf8-8d0176082edc" containerID="a9ce9ab17aa2c24f0982c9482d76789089a1574da84d8c833d06a8ac7dbec566" exitCode=0 Oct 03 00:09:19 crc kubenswrapper[4832]: I1003 00:09:19.702305 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4596m" event={"ID":"549600c0-8e76-4369-bcf8-8d0176082edc","Type":"ContainerDied","Data":"a9ce9ab17aa2c24f0982c9482d76789089a1574da84d8c833d06a8ac7dbec566"} Oct 03 00:09:19 crc kubenswrapper[4832]: I1003 00:09:19.702339 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4596m" event={"ID":"549600c0-8e76-4369-bcf8-8d0176082edc","Type":"ContainerStarted","Data":"bc1cf62dc334468735ea8fb70754d64b177db033bb0c48975860d51c5c720930"} Oct 03 00:09:19 crc kubenswrapper[4832]: I1003 00:09:19.716422 4832 generic.go:334] "Generic (PLEG): container finished" podID="c6022982-53b9-4c59-af62-90f4368e2744" containerID="ccbe0cf01977a41c99998c104bdbc78f97c353a335f3135306973fea63888b38" exitCode=0 Oct 03 00:09:19 crc kubenswrapper[4832]: I1003 00:09:19.716890 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" 
event={"ID":"c6022982-53b9-4c59-af62-90f4368e2744","Type":"ContainerDied","Data":"ccbe0cf01977a41c99998c104bdbc78f97c353a335f3135306973fea63888b38"} Oct 03 00:09:19 crc kubenswrapper[4832]: I1003 00:09:19.739867 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324160-nmbdb" event={"ID":"e78dfd8f-d109-412c-afcc-2e9fe3786bd3","Type":"ContainerDied","Data":"f42af82c3e084cc1af6355595344a43e09a34ba7a147ea90d47f225177012fef"} Oct 03 00:09:19 crc kubenswrapper[4832]: I1003 00:09:19.739914 4832 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f42af82c3e084cc1af6355595344a43e09a34ba7a147ea90d47f225177012fef" Oct 03 00:09:19 crc kubenswrapper[4832]: I1003 00:09:19.740241 4832 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324160-nmbdb" Oct 03 00:09:19 crc kubenswrapper[4832]: I1003 00:09:19.747965 4832 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jfkwz" Oct 03 00:09:19 crc kubenswrapper[4832]: I1003 00:09:19.876323 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lc7px"] Oct 03 00:09:20 crc kubenswrapper[4832]: I1003 00:09:20.762457 4832 generic.go:334] "Generic (PLEG): container finished" podID="137f996e-6dba-4fce-b6f4-650e9c96f006" containerID="95e4e53f36635761f69260037c83ff451793827f13a28db826b99a81c2884cb9" exitCode=0 Oct 03 00:09:20 crc kubenswrapper[4832]: I1003 00:09:20.762553 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-46d2w" event={"ID":"137f996e-6dba-4fce-b6f4-650e9c96f006","Type":"ContainerDied","Data":"95e4e53f36635761f69260037c83ff451793827f13a28db826b99a81c2884cb9"} Oct 03 00:09:20 crc kubenswrapper[4832]: I1003 00:09:20.762798 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-46d2w" event={"ID":"137f996e-6dba-4fce-b6f4-650e9c96f006","Type":"ContainerStarted","Data":"e2914fcd2e30098d364fe276059547f95e7f11f836e8e4601c87291e647c14f6"} Oct 03 00:09:20 crc kubenswrapper[4832]: I1003 00:09:20.773160 4832 generic.go:334] "Generic (PLEG): container finished" podID="7af8592d-ccb5-47d1-9f02-ffac59f8b467" containerID="fe3d3c2378be42897840528b565eca1ece459339ea7046e05847c42cf47c2972" exitCode=0 Oct 03 00:09:20 crc kubenswrapper[4832]: I1003 00:09:20.773365 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lc7px" event={"ID":"7af8592d-ccb5-47d1-9f02-ffac59f8b467","Type":"ContainerDied","Data":"fe3d3c2378be42897840528b565eca1ece459339ea7046e05847c42cf47c2972"} Oct 03 00:09:20 crc kubenswrapper[4832]: I1003 00:09:20.773396 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lc7px" event={"ID":"7af8592d-ccb5-47d1-9f02-ffac59f8b467","Type":"ContainerStarted","Data":"502be34ac64e8e927e09370f86b6e8becfe0ab6de9442f0f0c571be7bcbdeb5b"} Oct 03 00:09:21 crc kubenswrapper[4832]: I1003 00:09:21.026187 4832 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 03 00:09:21 crc kubenswrapper[4832]: I1003 00:09:21.145880 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c6022982-53b9-4c59-af62-90f4368e2744-kube-api-access\") pod \"c6022982-53b9-4c59-af62-90f4368e2744\" (UID: \"c6022982-53b9-4c59-af62-90f4368e2744\") " Oct 03 00:09:21 crc kubenswrapper[4832]: I1003 00:09:21.146022 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c6022982-53b9-4c59-af62-90f4368e2744-kubelet-dir\") pod \"c6022982-53b9-4c59-af62-90f4368e2744\" (UID: \"c6022982-53b9-4c59-af62-90f4368e2744\") " Oct 03 00:09:21 crc kubenswrapper[4832]: I1003 00:09:21.146371 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c6022982-53b9-4c59-af62-90f4368e2744-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "c6022982-53b9-4c59-af62-90f4368e2744" (UID: "c6022982-53b9-4c59-af62-90f4368e2744"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 00:09:21 crc kubenswrapper[4832]: I1003 00:09:21.153142 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6022982-53b9-4c59-af62-90f4368e2744-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "c6022982-53b9-4c59-af62-90f4368e2744" (UID: "c6022982-53b9-4c59-af62-90f4368e2744"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 00:09:21 crc kubenswrapper[4832]: I1003 00:09:21.247503 4832 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c6022982-53b9-4c59-af62-90f4368e2744-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 03 00:09:21 crc kubenswrapper[4832]: I1003 00:09:21.247541 4832 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c6022982-53b9-4c59-af62-90f4368e2744-kubelet-dir\") on node \"crc\" DevicePath \"\"" Oct 03 00:09:21 crc kubenswrapper[4832]: I1003 00:09:21.469780 4832 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 03 00:09:21 crc kubenswrapper[4832]: E1003 00:09:21.470296 4832 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e78dfd8f-d109-412c-afcc-2e9fe3786bd3" containerName="collect-profiles" Oct 03 00:09:21 crc kubenswrapper[4832]: I1003 00:09:21.470412 4832 state_mem.go:107] "Deleted CPUSet assignment" podUID="e78dfd8f-d109-412c-afcc-2e9fe3786bd3" containerName="collect-profiles" Oct 03 00:09:21 crc kubenswrapper[4832]: E1003 00:09:21.470563 4832 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6022982-53b9-4c59-af62-90f4368e2744" containerName="pruner" Oct 03 00:09:21 crc kubenswrapper[4832]: I1003 00:09:21.470646 4832 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6022982-53b9-4c59-af62-90f4368e2744" containerName="pruner" Oct 03 00:09:21 crc kubenswrapper[4832]: I1003 00:09:21.471105 4832 memory_manager.go:354] "RemoveStaleState removing state" podUID="e78dfd8f-d109-412c-afcc-2e9fe3786bd3" containerName="collect-profiles" Oct 03 00:09:21 crc kubenswrapper[4832]: I1003 00:09:21.471203 4832 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6022982-53b9-4c59-af62-90f4368e2744" containerName="pruner" Oct 03 
00:09:21 crc kubenswrapper[4832]: I1003 00:09:21.471855 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 03 00:09:21 crc kubenswrapper[4832]: I1003 00:09:21.474513 4832 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Oct 03 00:09:21 crc kubenswrapper[4832]: I1003 00:09:21.476770 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Oct 03 00:09:21 crc kubenswrapper[4832]: I1003 00:09:21.476862 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 03 00:09:21 crc kubenswrapper[4832]: I1003 00:09:21.554405 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8ae7d3f9-d39c-40d8-9fd5-ce2bf7d5d217-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"8ae7d3f9-d39c-40d8-9fd5-ce2bf7d5d217\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 03 00:09:21 crc kubenswrapper[4832]: I1003 00:09:21.554466 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8ae7d3f9-d39c-40d8-9fd5-ce2bf7d5d217-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"8ae7d3f9-d39c-40d8-9fd5-ce2bf7d5d217\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 03 00:09:21 crc kubenswrapper[4832]: I1003 00:09:21.657224 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8ae7d3f9-d39c-40d8-9fd5-ce2bf7d5d217-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"8ae7d3f9-d39c-40d8-9fd5-ce2bf7d5d217\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 03 00:09:21 crc kubenswrapper[4832]: I1003 00:09:21.657298 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8ae7d3f9-d39c-40d8-9fd5-ce2bf7d5d217-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"8ae7d3f9-d39c-40d8-9fd5-ce2bf7d5d217\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 03 00:09:21 crc kubenswrapper[4832]: I1003 00:09:21.657701 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8ae7d3f9-d39c-40d8-9fd5-ce2bf7d5d217-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"8ae7d3f9-d39c-40d8-9fd5-ce2bf7d5d217\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 03 00:09:21 crc kubenswrapper[4832]: I1003 00:09:21.696390 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8ae7d3f9-d39c-40d8-9fd5-ce2bf7d5d217-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"8ae7d3f9-d39c-40d8-9fd5-ce2bf7d5d217\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 03 00:09:21 crc kubenswrapper[4832]: I1003 00:09:21.781424 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"c6022982-53b9-4c59-af62-90f4368e2744","Type":"ContainerDied","Data":"5760af10582d55be8dd490a1f2fb93ae9a94050521955720c870cd1144d8d848"} Oct 03 00:09:21 crc kubenswrapper[4832]: I1003 00:09:21.781487 4832 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="5760af10582d55be8dd490a1f2fb93ae9a94050521955720c870cd1144d8d848" Oct 03 00:09:21 crc kubenswrapper[4832]: I1003 00:09:21.781550 4832 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 03 00:09:21 crc kubenswrapper[4832]: I1003 00:09:21.817403 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 03 00:09:22 crc kubenswrapper[4832]: I1003 00:09:22.287330 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 03 00:09:22 crc kubenswrapper[4832]: I1003 00:09:22.796381 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"8ae7d3f9-d39c-40d8-9fd5-ce2bf7d5d217","Type":"ContainerStarted","Data":"8514223da23b2b9459a708945e642ad994e1a2151229bae808b93b0b50a5102e"} Oct 03 00:09:23 crc kubenswrapper[4832]: I1003 00:09:23.804742 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"8ae7d3f9-d39c-40d8-9fd5-ce2bf7d5d217","Type":"ContainerStarted","Data":"154aa13389838a39e50db86b68fbaa35f1c8793ccb11469f55210e383a15f167"} Oct 03 00:09:24 crc kubenswrapper[4832]: I1003 00:09:24.173553 4832 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-svzx5" Oct 03 00:09:24 crc kubenswrapper[4832]: I1003 00:09:24.191887 4832 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=3.191867535 podStartE2EDuration="3.191867535s" podCreationTimestamp="2025-10-03 00:09:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 00:09:23.81930893 +0000 UTC m=+157.081464046" watchObservedRunningTime="2025-10-03 00:09:24.191867535 +0000 UTC m=+157.454022641" Oct 03 00:09:24 crc kubenswrapper[4832]: I1003 00:09:24.813496 4832 generic.go:334] "Generic (PLEG): container finished" podID="8ae7d3f9-d39c-40d8-9fd5-ce2bf7d5d217" containerID="154aa13389838a39e50db86b68fbaa35f1c8793ccb11469f55210e383a15f167" exitCode=0 Oct 03 00:09:24 crc kubenswrapper[4832]: I1003 00:09:24.813559 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"8ae7d3f9-d39c-40d8-9fd5-ce2bf7d5d217","Type":"ContainerDied","Data":"154aa13389838a39e50db86b68fbaa35f1c8793ccb11469f55210e383a15f167"} Oct 03 00:09:26 crc kubenswrapper[4832]: I1003 00:09:26.396652 4832 patch_prober.go:28] interesting pod/machine-config-daemon-xzpk4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 00:09:26 crc kubenswrapper[4832]: I1003 00:09:26.397270 4832 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" podUID="13eeb165-19f7-4e3f-a442-1c9c570d853b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 00:09:27 crc kubenswrapper[4832]: I1003 00:09:27.840538 4832 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-hxv6p" 
Oct 03 00:09:27 crc kubenswrapper[4832]: I1003 00:09:27.846103 4832 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-hxv6p" Oct 03 00:09:28 crc kubenswrapper[4832]: I1003 00:09:28.104785 4832 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-znmk2" Oct 03 00:09:28 crc kubenswrapper[4832]: I1003 00:09:28.315617 4832 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 03 00:09:28 crc kubenswrapper[4832]: I1003 00:09:28.390283 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8ae7d3f9-d39c-40d8-9fd5-ce2bf7d5d217-kube-api-access\") pod \"8ae7d3f9-d39c-40d8-9fd5-ce2bf7d5d217\" (UID: \"8ae7d3f9-d39c-40d8-9fd5-ce2bf7d5d217\") " Oct 03 00:09:28 crc kubenswrapper[4832]: I1003 00:09:28.390371 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8ae7d3f9-d39c-40d8-9fd5-ce2bf7d5d217-kubelet-dir\") pod \"8ae7d3f9-d39c-40d8-9fd5-ce2bf7d5d217\" (UID: \"8ae7d3f9-d39c-40d8-9fd5-ce2bf7d5d217\") " Oct 03 00:09:28 crc kubenswrapper[4832]: I1003 00:09:28.390522 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8ae7d3f9-d39c-40d8-9fd5-ce2bf7d5d217-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "8ae7d3f9-d39c-40d8-9fd5-ce2bf7d5d217" (UID: "8ae7d3f9-d39c-40d8-9fd5-ce2bf7d5d217"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 00:09:28 crc kubenswrapper[4832]: I1003 00:09:28.390671 4832 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8ae7d3f9-d39c-40d8-9fd5-ce2bf7d5d217-kubelet-dir\") on node \"crc\" DevicePath \"\"" Oct 03 00:09:28 crc kubenswrapper[4832]: I1003 00:09:28.398962 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ae7d3f9-d39c-40d8-9fd5-ce2bf7d5d217-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "8ae7d3f9-d39c-40d8-9fd5-ce2bf7d5d217" (UID: "8ae7d3f9-d39c-40d8-9fd5-ce2bf7d5d217"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 00:09:28 crc kubenswrapper[4832]: I1003 00:09:28.492488 4832 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8ae7d3f9-d39c-40d8-9fd5-ce2bf7d5d217-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 03 00:09:28 crc kubenswrapper[4832]: I1003 00:09:28.835815 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"8ae7d3f9-d39c-40d8-9fd5-ce2bf7d5d217","Type":"ContainerDied","Data":"8514223da23b2b9459a708945e642ad994e1a2151229bae808b93b0b50a5102e"} Oct 03 00:09:28 crc kubenswrapper[4832]: I1003 00:09:28.835874 4832 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8514223da23b2b9459a708945e642ad994e1a2151229bae808b93b0b50a5102e" Oct 03 00:09:28 crc kubenswrapper[4832]: I1003 00:09:28.835833 4832 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 03 00:09:33 crc kubenswrapper[4832]: I1003 00:09:33.685716 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f3e796ab-f354-47c2-8f1b-1d29fe9ab059-metrics-certs\") pod \"network-metrics-daemon-ppthn\" (UID: \"f3e796ab-f354-47c2-8f1b-1d29fe9ab059\") " pod="openshift-multus/network-metrics-daemon-ppthn" Oct 03 00:09:33 crc kubenswrapper[4832]: I1003 00:09:33.692151 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f3e796ab-f354-47c2-8f1b-1d29fe9ab059-metrics-certs\") pod \"network-metrics-daemon-ppthn\" (UID: \"f3e796ab-f354-47c2-8f1b-1d29fe9ab059\") " pod="openshift-multus/network-metrics-daemon-ppthn" Oct 03 00:09:33 crc kubenswrapper[4832]: I1003 00:09:33.795747 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ppthn" Oct 03 00:09:36 crc kubenswrapper[4832]: I1003 00:09:36.954050 4832 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-h4hkd" Oct 03 00:09:42 crc kubenswrapper[4832]: I1003 00:09:42.920302 4832 generic.go:334] "Generic (PLEG): container finished" podID="3335886b-58ec-44d4-9b6a-b7d81b4aa3fe" containerID="9b97b4bc4c5797363fcf9c6b533a58746dc4bd1bd30b05196ed45e9bc9a5ed51" exitCode=0 Oct 03 00:09:42 crc kubenswrapper[4832]: I1003 00:09:42.920402 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29324160-vrdgr" event={"ID":"3335886b-58ec-44d4-9b6a-b7d81b4aa3fe","Type":"ContainerDied","Data":"9b97b4bc4c5797363fcf9c6b533a58746dc4bd1bd30b05196ed45e9bc9a5ed51"} Oct 03 00:09:47 crc kubenswrapper[4832]: I1003 00:09:47.382879 4832 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-pruner-29324160-vrdgr" Oct 03 00:09:47 crc kubenswrapper[4832]: E1003 00:09:47.421034 4832 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Oct 03 00:09:47 crc kubenswrapper[4832]: E1003 00:09:47.421201 4832 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xd2dt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-lhnfr_openshift-marketplace(bbb2ac77-f639-41e8-897d-7744dd4e987d): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 03 00:09:47 crc kubenswrapper[4832]: E1003 00:09:47.422385 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-lhnfr" podUID="bbb2ac77-f639-41e8-897d-7744dd4e987d" Oct 03 00:09:47 crc kubenswrapper[4832]: I1003 00:09:47.469429 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3335886b-58ec-44d4-9b6a-b7d81b4aa3fe-serviceca\") pod \"3335886b-58ec-44d4-9b6a-b7d81b4aa3fe\" (UID: \"3335886b-58ec-44d4-9b6a-b7d81b4aa3fe\") " Oct 03 00:09:47 crc kubenswrapper[4832]: I1003 00:09:47.469563 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7j75j\" (UniqueName: \"kubernetes.io/projected/3335886b-58ec-44d4-9b6a-b7d81b4aa3fe-kube-api-access-7j75j\") pod \"3335886b-58ec-44d4-9b6a-b7d81b4aa3fe\" (UID: \"3335886b-58ec-44d4-9b6a-b7d81b4aa3fe\") " Oct 03 00:09:47 crc kubenswrapper[4832]: I1003 00:09:47.470036 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded 
for volume "kubernetes.io/configmap/3335886b-58ec-44d4-9b6a-b7d81b4aa3fe-serviceca" (OuterVolumeSpecName: "serviceca") pod "3335886b-58ec-44d4-9b6a-b7d81b4aa3fe" (UID: "3335886b-58ec-44d4-9b6a-b7d81b4aa3fe"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 00:09:47 crc kubenswrapper[4832]: I1003 00:09:47.474960 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3335886b-58ec-44d4-9b6a-b7d81b4aa3fe-kube-api-access-7j75j" (OuterVolumeSpecName: "kube-api-access-7j75j") pod "3335886b-58ec-44d4-9b6a-b7d81b4aa3fe" (UID: "3335886b-58ec-44d4-9b6a-b7d81b4aa3fe"). InnerVolumeSpecName "kube-api-access-7j75j". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 00:09:47 crc kubenswrapper[4832]: I1003 00:09:47.571346 4832 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7j75j\" (UniqueName: \"kubernetes.io/projected/3335886b-58ec-44d4-9b6a-b7d81b4aa3fe-kube-api-access-7j75j\") on node \"crc\" DevicePath \"\"" Oct 03 00:09:47 crc kubenswrapper[4832]: I1003 00:09:47.571381 4832 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3335886b-58ec-44d4-9b6a-b7d81b4aa3fe-serviceca\") on node \"crc\" DevicePath \"\"" Oct 03 00:09:47 crc kubenswrapper[4832]: E1003 00:09:47.704772 4832 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: reading blob sha256:afa798da1eea334bab3cb1e14451ff84f98d35b436cdc4b408b46e289e4e2bc2: Get \"https://registry.redhat.io/v2/redhat/redhat-marketplace-index/blobs/sha256:afa798da1eea334bab3cb1e14451ff84f98d35b436cdc4b408b46e289e4e2bc2\": context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Oct 03 00:09:47 crc kubenswrapper[4832]: E1003 00:09:47.705228 4832 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vfzpx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
redhat-marketplace-28hml_openshift-marketplace(99b27c40-f8ce-41a0-befc-18366ba6770b): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: reading blob sha256:afa798da1eea334bab3cb1e14451ff84f98d35b436cdc4b408b46e289e4e2bc2: Get \"https://registry.redhat.io/v2/redhat/redhat-marketplace-index/blobs/sha256:afa798da1eea334bab3cb1e14451ff84f98d35b436cdc4b408b46e289e4e2bc2\": context canceled" logger="UnhandledError" Oct 03 00:09:47 crc kubenswrapper[4832]: E1003 00:09:47.706386 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: reading blob sha256:afa798da1eea334bab3cb1e14451ff84f98d35b436cdc4b408b46e289e4e2bc2: Get \\\"https://registry.redhat.io/v2/redhat/redhat-marketplace-index/blobs/sha256:afa798da1eea334bab3cb1e14451ff84f98d35b436cdc4b408b46e289e4e2bc2\\\": context canceled\"" pod="openshift-marketplace/redhat-marketplace-28hml" podUID="99b27c40-f8ce-41a0-befc-18366ba6770b" Oct 03 00:09:47 crc kubenswrapper[4832]: E1003 00:09:47.770074 4832 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Oct 03 00:09:47 crc kubenswrapper[4832]: E1003 00:09:47.770344 4832 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-f22mv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-dv2sr_openshift-marketplace(13a851af-b472-4647-8093-8b0cd577b21a): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 03 00:09:47 crc kubenswrapper[4832]: E1003 00:09:47.771803 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image 
from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-dv2sr" podUID="13a851af-b472-4647-8093-8b0cd577b21a" Oct 03 00:09:47 crc kubenswrapper[4832]: I1003 00:09:47.946627 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29324160-vrdgr" event={"ID":"3335886b-58ec-44d4-9b6a-b7d81b4aa3fe","Type":"ContainerDied","Data":"079476f1b4f6b1ce0eebb29031c65aef31d18ec1803bf0dffdd295a8e2ab1466"} Oct 03 00:09:47 crc kubenswrapper[4832]: I1003 00:09:47.946677 4832 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="079476f1b4f6b1ce0eebb29031c65aef31d18ec1803bf0dffdd295a8e2ab1466" Oct 03 00:09:47 crc kubenswrapper[4832]: I1003 00:09:47.946697 4832 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-pruner-29324160-vrdgr" Oct 03 00:09:48 crc kubenswrapper[4832]: I1003 00:09:48.970855 4832 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-l5pfg" Oct 03 00:09:49 crc kubenswrapper[4832]: E1003 00:09:49.022196 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-lhnfr" podUID="bbb2ac77-f639-41e8-897d-7744dd4e987d" Oct 03 00:09:49 crc kubenswrapper[4832]: E1003 00:09:49.022306 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-dv2sr" podUID="13a851af-b472-4647-8093-8b0cd577b21a" Oct 03 00:09:49 crc kubenswrapper[4832]: E1003 00:09:49.022567 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-28hml" podUID="99b27c40-f8ce-41a0-befc-18366ba6770b" Oct 03 00:09:49 crc kubenswrapper[4832]: E1003 00:09:49.101241 4832 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Oct 03 00:09:49 crc kubenswrapper[4832]: E1003 00:09:49.101665 4832 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2lxtj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-5wfsk_openshift-marketplace(bad1920b-7616-41f9-b2f9-84b3073faede): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 03 00:09:49 crc kubenswrapper[4832]: E1003 00:09:49.102729 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-5wfsk" podUID="bad1920b-7616-41f9-b2f9-84b3073faede" Oct 03 00:09:51 crc kubenswrapper[4832]: E1003 00:09:51.538857 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-5wfsk" podUID="bad1920b-7616-41f9-b2f9-84b3073faede" Oct 03 00:09:54 crc kubenswrapper[4832]: I1003 00:09:54.203461 4832 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 00:09:55 crc kubenswrapper[4832]: E1003 00:09:55.105981 4832 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Oct 03 00:09:55 crc kubenswrapper[4832]: E1003 00:09:55.106137 4832 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-297jq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-4596m_openshift-marketplace(549600c0-8e76-4369-bcf8-8d0176082edc): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 03 00:09:55 crc kubenswrapper[4832]: E1003 00:09:55.108369 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-4596m" podUID="549600c0-8e76-4369-bcf8-8d0176082edc" Oct 03 00:09:55 crc kubenswrapper[4832]: I1003 00:09:55.326207 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-ppthn"] Oct 03 00:09:55 crc kubenswrapper[4832]: W1003 00:09:55.338344 4832 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf3e796ab_f354_47c2_8f1b_1d29fe9ab059.slice/crio-fe03cb1e94bcc0ab98cc1db763bb19e5d5f9782358b46e220f50d422be2152de WatchSource:0}: Error finding container fe03cb1e94bcc0ab98cc1db763bb19e5d5f9782358b46e220f50d422be2152de: Status 404 returned error can't find the container with id fe03cb1e94bcc0ab98cc1db763bb19e5d5f9782358b46e220f50d422be2152de Oct 03 00:09:55 crc kubenswrapper[4832]: I1003 00:09:55.988125 4832 generic.go:334] "Generic (PLEG): container finished" podID="93fca13a-c8e1-4239-b419-e4055be395cd" containerID="8ff0a793c1a08512d57d2c179e5630317475e44c1a35cfc868281716e8bdc431" exitCode=0 Oct 03 00:09:55 crc kubenswrapper[4832]: I1003 00:09:55.988230 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j9lmn" event={"ID":"93fca13a-c8e1-4239-b419-e4055be395cd","Type":"ContainerDied","Data":"8ff0a793c1a08512d57d2c179e5630317475e44c1a35cfc868281716e8bdc431"} Oct 03 00:09:55 crc kubenswrapper[4832]: I1003 00:09:55.990867 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-46d2w" 
event={"ID":"137f996e-6dba-4fce-b6f4-650e9c96f006","Type":"ContainerStarted","Data":"7087abea0470ec63e9b19c89006ea2d6c130ca8dc732a366288bc8a233e33b96"} Oct 03 00:09:55 crc kubenswrapper[4832]: I1003 00:09:55.993267 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lc7px" event={"ID":"7af8592d-ccb5-47d1-9f02-ffac59f8b467","Type":"ContainerStarted","Data":"eb5ff900930503b1f77b21f28af1b084dc0d107f1fbffcabc575f1adbb302cbb"} Oct 03 00:09:55 crc kubenswrapper[4832]: I1003 00:09:55.995883 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-ppthn" event={"ID":"f3e796ab-f354-47c2-8f1b-1d29fe9ab059","Type":"ContainerStarted","Data":"5568dbb82c3eeb745150ca84db4751874b4be687337925ce8d1efbd69cd2f16f"} Oct 03 00:09:55 crc kubenswrapper[4832]: I1003 00:09:55.995926 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-ppthn" event={"ID":"f3e796ab-f354-47c2-8f1b-1d29fe9ab059","Type":"ContainerStarted","Data":"fe03cb1e94bcc0ab98cc1db763bb19e5d5f9782358b46e220f50d422be2152de"} Oct 03 00:09:55 crc kubenswrapper[4832]: E1003 00:09:55.996971 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-4596m" podUID="549600c0-8e76-4369-bcf8-8d0176082edc" Oct 03 00:09:56 crc kubenswrapper[4832]: I1003 00:09:56.396860 4832 patch_prober.go:28] interesting pod/machine-config-daemon-xzpk4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 00:09:56 crc kubenswrapper[4832]: I1003 00:09:56.396911 4832 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" podUID="13eeb165-19f7-4e3f-a442-1c9c570d853b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 00:09:57 crc kubenswrapper[4832]: I1003 00:09:57.001484 4832 generic.go:334] "Generic (PLEG): container finished" podID="137f996e-6dba-4fce-b6f4-650e9c96f006" containerID="7087abea0470ec63e9b19c89006ea2d6c130ca8dc732a366288bc8a233e33b96" exitCode=0 Oct 03 00:09:57 crc kubenswrapper[4832]: I1003 00:09:57.001540 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-46d2w" event={"ID":"137f996e-6dba-4fce-b6f4-650e9c96f006","Type":"ContainerDied","Data":"7087abea0470ec63e9b19c89006ea2d6c130ca8dc732a366288bc8a233e33b96"} Oct 03 00:09:57 crc kubenswrapper[4832]: I1003 00:09:57.005574 4832 generic.go:334] "Generic (PLEG): container finished" podID="7af8592d-ccb5-47d1-9f02-ffac59f8b467" containerID="eb5ff900930503b1f77b21f28af1b084dc0d107f1fbffcabc575f1adbb302cbb" exitCode=0 Oct 03 00:09:57 crc kubenswrapper[4832]: I1003 00:09:57.005689 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lc7px" event={"ID":"7af8592d-ccb5-47d1-9f02-ffac59f8b467","Type":"ContainerDied","Data":"eb5ff900930503b1f77b21f28af1b084dc0d107f1fbffcabc575f1adbb302cbb"} Oct 03 00:09:58 crc kubenswrapper[4832]: I1003 00:09:58.012993 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/network-metrics-daemon-ppthn" event={"ID":"f3e796ab-f354-47c2-8f1b-1d29fe9ab059","Type":"ContainerStarted","Data":"1939eae632a95cfab4807b8b0c172f4f71c6b86c3c21733b83f7342f32730516"} Oct 03 00:09:59 crc kubenswrapper[4832]: I1003 00:09:59.042121 4832 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-ppthn" podStartSLOduration=169.04209717 podStartE2EDuration="2m49.04209717s" podCreationTimestamp="2025-10-03 00:07:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 00:09:59.034277147 +0000 UTC m=+192.296432273" watchObservedRunningTime="2025-10-03 00:09:59.04209717 +0000 UTC m=+192.304252296" Oct 03 00:10:02 crc kubenswrapper[4832]: I1003 00:10:02.039440 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j9lmn" event={"ID":"93fca13a-c8e1-4239-b419-e4055be395cd","Type":"ContainerStarted","Data":"fe1e6b4f69f22cf628aa168c8df5abf88af1530a722fb8fda3693973d4884ae9"} Oct 03 00:10:02 crc kubenswrapper[4832]: I1003 00:10:02.062423 4832 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-j9lmn" podStartSLOduration=3.659900804 podStartE2EDuration="47.062386339s" podCreationTimestamp="2025-10-03 00:09:15 +0000 UTC" firstStartedPulling="2025-10-03 00:09:17.45271106 +0000 UTC m=+150.714866166" lastFinishedPulling="2025-10-03 00:10:00.855196595 +0000 UTC m=+194.117351701" observedRunningTime="2025-10-03 00:10:02.057106802 +0000 UTC m=+195.319261908" watchObservedRunningTime="2025-10-03 00:10:02.062386339 +0000 UTC m=+195.324541455" Oct 03 00:10:05 crc kubenswrapper[4832]: I1003 00:10:05.055609 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lc7px" event={"ID":"7af8592d-ccb5-47d1-9f02-ffac59f8b467","Type":"ContainerStarted","Data":"b34ca52dc0bc12b3e1deda133a0ad0990041099c97565acb489aa799b80a5b56"} Oct 03 00:10:05 crc kubenswrapper[4832]: I1003 00:10:05.074940 4832 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-lc7px" podStartSLOduration=2.864290404 podStartE2EDuration="46.074921108s" podCreationTimestamp="2025-10-03 00:09:19 +0000 UTC" firstStartedPulling="2025-10-03 00:09:20.778106287 +0000 UTC m=+154.040261393" lastFinishedPulling="2025-10-03 00:10:03.988736991 +0000 UTC m=+197.250892097" observedRunningTime="2025-10-03 00:10:05.071281576 +0000 UTC m=+198.333436702" watchObservedRunningTime="2025-10-03 00:10:05.074921108 +0000 UTC m=+198.337076214" Oct 03 00:10:06 crc kubenswrapper[4832]: I1003 00:10:06.003684 4832 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-j9lmn" Oct 03 00:10:06 crc kubenswrapper[4832]: I1003 00:10:06.003777 4832 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-j9lmn" Oct 03 00:10:07 crc kubenswrapper[4832]: I1003 00:10:07.612742 4832 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-j9lmn" podUID="93fca13a-c8e1-4239-b419-e4055be395cd" containerName="registry-server" probeResult="failure" output=< Oct 03 00:10:07 crc kubenswrapper[4832]: timeout: failed to connect service ":50051" within 1s Oct 03 00:10:07 crc kubenswrapper[4832]: > Oct 03 00:10:08 crc kubenswrapper[4832]: I1003 00:10:08.074408 4832 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-46d2w" event={"ID":"137f996e-6dba-4fce-b6f4-650e9c96f006","Type":"ContainerStarted","Data":"6ee1a34ad047636439552083715a9e26012d5619b3bf95b5fdfa309e942dd909"} Oct 03 00:10:09 crc kubenswrapper[4832]: I1003 00:10:09.096865 4832 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-46d2w" podStartSLOduration=5.677611136 podStartE2EDuration="51.096849501s" podCreationTimestamp="2025-10-03 00:09:18 +0000 UTC" firstStartedPulling="2025-10-03 00:09:20.768872296 +0000 UTC m=+154.031027402" lastFinishedPulling="2025-10-03 00:10:06.188110661 +0000 UTC m=+199.450265767" observedRunningTime="2025-10-03 00:10:09.094836867 +0000 UTC m=+202.356991993" watchObservedRunningTime="2025-10-03 00:10:09.096849501 +0000 UTC m=+202.359004607" Oct 03 00:10:09 crc kubenswrapper[4832]: I1003 00:10:09.142866 4832 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-46d2w" Oct 03 00:10:09 crc kubenswrapper[4832]: I1003 00:10:09.142934 4832 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-46d2w" Oct 03 00:10:09 crc kubenswrapper[4832]: I1003 00:10:09.444784 4832 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-lc7px" Oct 03 00:10:09 crc kubenswrapper[4832]: I1003 00:10:09.444863 4832 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-lc7px" Oct 03 00:10:10 crc kubenswrapper[4832]: I1003 00:10:10.213482 4832 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-46d2w" podUID="137f996e-6dba-4fce-b6f4-650e9c96f006" containerName="registry-server" probeResult="failure" output=< Oct 03 00:10:10 crc kubenswrapper[4832]: timeout: failed to connect service ":50051" within 1s Oct 03 00:10:10 crc kubenswrapper[4832]: > Oct 03 00:10:10 crc kubenswrapper[4832]: I1003 00:10:10.481253 4832 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-lc7px" podUID="7af8592d-ccb5-47d1-9f02-ffac59f8b467" containerName="registry-server" probeResult="failure" output=< Oct 03 00:10:10 crc kubenswrapper[4832]: timeout: failed to connect service ":50051" within 1s Oct 03 00:10:10 crc kubenswrapper[4832]: > Oct 03 00:10:16 crc kubenswrapper[4832]: I1003 00:10:16.122995 4832 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-j9lmn" Oct 03 00:10:16 crc kubenswrapper[4832]: I1003 00:10:16.132477 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dv2sr" event={"ID":"13a851af-b472-4647-8093-8b0cd577b21a","Type":"ContainerStarted","Data":"b896c9e938a8a43b0393a0ae5b8dd56b10cfc62f8b17b098f963774631f7f963"} Oct 03 00:10:16 crc kubenswrapper[4832]: I1003 00:10:16.135029 4832 generic.go:334] "Generic (PLEG): container finished" podID="99b27c40-f8ce-41a0-befc-18366ba6770b" containerID="88d8364072fbe04131882db71556d56e42b7e320648f0032fe1636e23ba80493" exitCode=0 Oct 03 00:10:16 crc kubenswrapper[4832]: I1003 00:10:16.135088 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-28hml" event={"ID":"99b27c40-f8ce-41a0-befc-18366ba6770b","Type":"ContainerDied","Data":"88d8364072fbe04131882db71556d56e42b7e320648f0032fe1636e23ba80493"} Oct 
03 00:10:16 crc kubenswrapper[4832]: I1003 00:10:16.139372 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5wfsk" event={"ID":"bad1920b-7616-41f9-b2f9-84b3073faede","Type":"ContainerStarted","Data":"7e95472db3cd92ba28d459a445f97687dff54f2233f33021ca6832de0293708f"} Oct 03 00:10:16 crc kubenswrapper[4832]: I1003 00:10:16.147960 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lhnfr" event={"ID":"bbb2ac77-f639-41e8-897d-7744dd4e987d","Type":"ContainerStarted","Data":"6491ccdc6a1d5004158059ca509f682a55b04b2a4c710fd26ee4359f594d051e"} Oct 03 00:10:16 crc kubenswrapper[4832]: I1003 00:10:16.216104 4832 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-j9lmn" Oct 03 00:10:17 crc kubenswrapper[4832]: I1003 00:10:17.160288 4832 generic.go:334] "Generic (PLEG): container finished" podID="bbb2ac77-f639-41e8-897d-7744dd4e987d" containerID="6491ccdc6a1d5004158059ca509f682a55b04b2a4c710fd26ee4359f594d051e" exitCode=0 Oct 03 00:10:17 crc kubenswrapper[4832]: I1003 00:10:17.160340 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lhnfr" event={"ID":"bbb2ac77-f639-41e8-897d-7744dd4e987d","Type":"ContainerDied","Data":"6491ccdc6a1d5004158059ca509f682a55b04b2a4c710fd26ee4359f594d051e"} Oct 03 00:10:17 crc kubenswrapper[4832]: I1003 00:10:17.164555 4832 generic.go:334] "Generic (PLEG): container finished" podID="13a851af-b472-4647-8093-8b0cd577b21a" containerID="b896c9e938a8a43b0393a0ae5b8dd56b10cfc62f8b17b098f963774631f7f963" exitCode=0 Oct 03 00:10:17 crc kubenswrapper[4832]: I1003 00:10:17.164624 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dv2sr" event={"ID":"13a851af-b472-4647-8093-8b0cd577b21a","Type":"ContainerDied","Data":"b896c9e938a8a43b0393a0ae5b8dd56b10cfc62f8b17b098f963774631f7f963"} Oct 03 00:10:17 crc kubenswrapper[4832]: I1003 00:10:17.167945 4832 generic.go:334] "Generic (PLEG): container finished" podID="bad1920b-7616-41f9-b2f9-84b3073faede" containerID="7e95472db3cd92ba28d459a445f97687dff54f2233f33021ca6832de0293708f" exitCode=0 Oct 03 00:10:17 crc kubenswrapper[4832]: I1003 00:10:17.168417 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5wfsk" event={"ID":"bad1920b-7616-41f9-b2f9-84b3073faede","Type":"ContainerDied","Data":"7e95472db3cd92ba28d459a445f97687dff54f2233f33021ca6832de0293708f"} Oct 03 00:10:18 crc kubenswrapper[4832]: I1003 00:10:18.176478 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dv2sr" event={"ID":"13a851af-b472-4647-8093-8b0cd577b21a","Type":"ContainerStarted","Data":"08b1bd359494f6e779f801c60e27af828b455cdb19f62f107465789350306ed3"} Oct 03 00:10:18 crc kubenswrapper[4832]: I1003 00:10:18.179097 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-28hml" event={"ID":"99b27c40-f8ce-41a0-befc-18366ba6770b","Type":"ContainerStarted","Data":"27d909a22d477b03dcbbd948ad95d924e6bf452aa738d6e3935898779acf57b1"} Oct 03 00:10:18 crc kubenswrapper[4832]: I1003 00:10:18.181659 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5wfsk" event={"ID":"bad1920b-7616-41f9-b2f9-84b3073faede","Type":"ContainerStarted","Data":"f6f5477ef1a4c5e55c48a26a213c9aeb1bc020e393e0588400e069d8f6b3eaa7"} Oct 03 00:10:18 
crc kubenswrapper[4832]: I1003 00:10:18.183522 4832 generic.go:334] "Generic (PLEG): container finished" podID="549600c0-8e76-4369-bcf8-8d0176082edc" containerID="dc7ac546b1574b0b4a3dd26365f018f404781df932fb7804e0a0328e8c861fbf" exitCode=0 Oct 03 00:10:18 crc kubenswrapper[4832]: I1003 00:10:18.183591 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4596m" event={"ID":"549600c0-8e76-4369-bcf8-8d0176082edc","Type":"ContainerDied","Data":"dc7ac546b1574b0b4a3dd26365f018f404781df932fb7804e0a0328e8c861fbf"} Oct 03 00:10:18 crc kubenswrapper[4832]: I1003 00:10:18.186033 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lhnfr" event={"ID":"bbb2ac77-f639-41e8-897d-7744dd4e987d","Type":"ContainerStarted","Data":"758bd1e89a2b4fba3a0850c3659e0903fe382783b207f84d5456f817169798be"} Oct 03 00:10:18 crc kubenswrapper[4832]: I1003 00:10:18.196257 4832 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-dv2sr" podStartSLOduration=2.889968058 podStartE2EDuration="1m3.196238184s" podCreationTimestamp="2025-10-03 00:09:15 +0000 UTC" firstStartedPulling="2025-10-03 00:09:17.488964248 +0000 UTC m=+150.751119354" lastFinishedPulling="2025-10-03 00:10:17.795234334 +0000 UTC m=+211.057389480" observedRunningTime="2025-10-03 00:10:18.193114942 +0000 UTC m=+211.455270048" watchObservedRunningTime="2025-10-03 00:10:18.196238184 +0000 UTC m=+211.458393290" Oct 03 00:10:18 crc kubenswrapper[4832]: I1003 00:10:18.210506 4832 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-5wfsk" podStartSLOduration=2.168253077 podStartE2EDuration="1m2.210487483s" podCreationTimestamp="2025-10-03 00:09:16 +0000 UTC" firstStartedPulling="2025-10-03 00:09:17.530705062 +0000 UTC m=+150.792860168" lastFinishedPulling="2025-10-03 00:10:17.572939468 +0000 UTC m=+210.835094574" observedRunningTime="2025-10-03 00:10:18.208453529 +0000 UTC m=+211.470608635" watchObservedRunningTime="2025-10-03 00:10:18.210487483 +0000 UTC m=+211.472642589" Oct 03 00:10:18 crc kubenswrapper[4832]: I1003 00:10:18.242141 4832 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-lhnfr" podStartSLOduration=3.06896524 podStartE2EDuration="1m3.242121143s" podCreationTimestamp="2025-10-03 00:09:15 +0000 UTC" firstStartedPulling="2025-10-03 00:09:17.59996422 +0000 UTC m=+150.862119316" lastFinishedPulling="2025-10-03 00:10:17.773120113 +0000 UTC m=+211.035275219" observedRunningTime="2025-10-03 00:10:18.240392295 +0000 UTC m=+211.502547401" watchObservedRunningTime="2025-10-03 00:10:18.242121143 +0000 UTC m=+211.504276249" Oct 03 00:10:19 crc kubenswrapper[4832]: I1003 00:10:19.183453 4832 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-46d2w" Oct 03 00:10:19 crc kubenswrapper[4832]: I1003 00:10:19.201486 4832 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-28hml" podStartSLOduration=4.371481177 podStartE2EDuration="1m2.201458941s" podCreationTimestamp="2025-10-03 00:09:17 +0000 UTC" firstStartedPulling="2025-10-03 00:09:19.695410547 +0000 UTC m=+152.957565653" lastFinishedPulling="2025-10-03 00:10:17.525388311 +0000 UTC m=+210.787543417" observedRunningTime="2025-10-03 00:10:18.262063458 +0000 UTC m=+211.524218564" watchObservedRunningTime="2025-10-03 
00:10:19.201458941 +0000 UTC m=+212.463614047" Oct 03 00:10:19 crc kubenswrapper[4832]: I1003 00:10:19.226113 4832 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-46d2w" Oct 03 00:10:19 crc kubenswrapper[4832]: I1003 00:10:19.484730 4832 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-lc7px" Oct 03 00:10:19 crc kubenswrapper[4832]: I1003 00:10:19.524733 4832 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-lc7px" Oct 03 00:10:20 crc kubenswrapper[4832]: I1003 00:10:20.198058 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4596m" event={"ID":"549600c0-8e76-4369-bcf8-8d0176082edc","Type":"ContainerStarted","Data":"4b285bf646bef6bd2db3583072c5d6bb3beaa44ed1c380aa50a223014ef5f7f0"} Oct 03 00:10:20 crc kubenswrapper[4832]: I1003 00:10:20.220673 4832 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-4596m" podStartSLOduration=3.2908429200000002 podStartE2EDuration="1m3.220653703s" podCreationTimestamp="2025-10-03 00:09:17 +0000 UTC" firstStartedPulling="2025-10-03 00:09:19.708645702 +0000 UTC m=+152.970800808" lastFinishedPulling="2025-10-03 00:10:19.638456485 +0000 UTC m=+212.900611591" observedRunningTime="2025-10-03 00:10:20.21568343 +0000 UTC m=+213.477838536" watchObservedRunningTime="2025-10-03 00:10:20.220653703 +0000 UTC m=+213.482808809" Oct 03 00:10:22 crc kubenswrapper[4832]: I1003 00:10:22.499700 4832 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lc7px"] Oct 03 00:10:22 crc kubenswrapper[4832]: I1003 00:10:22.500248 4832 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-lc7px" podUID="7af8592d-ccb5-47d1-9f02-ffac59f8b467" containerName="registry-server" containerID="cri-o://b34ca52dc0bc12b3e1deda133a0ad0990041099c97565acb489aa799b80a5b56" gracePeriod=2 Oct 03 00:10:23 crc kubenswrapper[4832]: I1003 00:10:23.216664 4832 generic.go:334] "Generic (PLEG): container finished" podID="7af8592d-ccb5-47d1-9f02-ffac59f8b467" containerID="b34ca52dc0bc12b3e1deda133a0ad0990041099c97565acb489aa799b80a5b56" exitCode=0 Oct 03 00:10:23 crc kubenswrapper[4832]: I1003 00:10:23.216722 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lc7px" event={"ID":"7af8592d-ccb5-47d1-9f02-ffac59f8b467","Type":"ContainerDied","Data":"b34ca52dc0bc12b3e1deda133a0ad0990041099c97565acb489aa799b80a5b56"} Oct 03 00:10:23 crc kubenswrapper[4832]: I1003 00:10:23.415402 4832 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-lc7px" Oct 03 00:10:23 crc kubenswrapper[4832]: I1003 00:10:23.477487 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7af8592d-ccb5-47d1-9f02-ffac59f8b467-catalog-content\") pod \"7af8592d-ccb5-47d1-9f02-ffac59f8b467\" (UID: \"7af8592d-ccb5-47d1-9f02-ffac59f8b467\") " Oct 03 00:10:23 crc kubenswrapper[4832]: I1003 00:10:23.477578 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7af8592d-ccb5-47d1-9f02-ffac59f8b467-utilities\") pod \"7af8592d-ccb5-47d1-9f02-ffac59f8b467\" (UID: \"7af8592d-ccb5-47d1-9f02-ffac59f8b467\") " Oct 03 00:10:23 crc kubenswrapper[4832]: I1003 00:10:23.477670 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qqs62\" (UniqueName: \"kubernetes.io/projected/7af8592d-ccb5-47d1-9f02-ffac59f8b467-kube-api-access-qqs62\") pod \"7af8592d-ccb5-47d1-9f02-ffac59f8b467\" (UID: \"7af8592d-ccb5-47d1-9f02-ffac59f8b467\") " Oct 03 00:10:23 crc kubenswrapper[4832]: I1003 00:10:23.478552 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7af8592d-ccb5-47d1-9f02-ffac59f8b467-utilities" (OuterVolumeSpecName: "utilities") pod "7af8592d-ccb5-47d1-9f02-ffac59f8b467" (UID: "7af8592d-ccb5-47d1-9f02-ffac59f8b467"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 00:10:23 crc kubenswrapper[4832]: I1003 00:10:23.482813 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7af8592d-ccb5-47d1-9f02-ffac59f8b467-kube-api-access-qqs62" (OuterVolumeSpecName: "kube-api-access-qqs62") pod "7af8592d-ccb5-47d1-9f02-ffac59f8b467" (UID: "7af8592d-ccb5-47d1-9f02-ffac59f8b467"). InnerVolumeSpecName "kube-api-access-qqs62". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 00:10:23 crc kubenswrapper[4832]: I1003 00:10:23.551655 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7af8592d-ccb5-47d1-9f02-ffac59f8b467-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7af8592d-ccb5-47d1-9f02-ffac59f8b467" (UID: "7af8592d-ccb5-47d1-9f02-ffac59f8b467"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 00:10:23 crc kubenswrapper[4832]: I1003 00:10:23.579417 4832 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7af8592d-ccb5-47d1-9f02-ffac59f8b467-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 00:10:23 crc kubenswrapper[4832]: I1003 00:10:23.579446 4832 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7af8592d-ccb5-47d1-9f02-ffac59f8b467-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 00:10:23 crc kubenswrapper[4832]: I1003 00:10:23.579457 4832 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qqs62\" (UniqueName: \"kubernetes.io/projected/7af8592d-ccb5-47d1-9f02-ffac59f8b467-kube-api-access-qqs62\") on node \"crc\" DevicePath \"\"" Oct 03 00:10:24 crc kubenswrapper[4832]: I1003 00:10:24.228998 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lc7px" event={"ID":"7af8592d-ccb5-47d1-9f02-ffac59f8b467","Type":"ContainerDied","Data":"502be34ac64e8e927e09370f86b6e8becfe0ab6de9442f0f0c571be7bcbdeb5b"} Oct 03 00:10:24 crc kubenswrapper[4832]: I1003 00:10:24.229065 4832 scope.go:117] "RemoveContainer" containerID="b34ca52dc0bc12b3e1deda133a0ad0990041099c97565acb489aa799b80a5b56" Oct 03 00:10:24 crc kubenswrapper[4832]: I1003 00:10:24.229097 4832 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lc7px" Oct 03 00:10:24 crc kubenswrapper[4832]: I1003 00:10:24.249261 4832 scope.go:117] "RemoveContainer" containerID="eb5ff900930503b1f77b21f28af1b084dc0d107f1fbffcabc575f1adbb302cbb" Oct 03 00:10:24 crc kubenswrapper[4832]: I1003 00:10:24.267970 4832 scope.go:117] "RemoveContainer" containerID="fe3d3c2378be42897840528b565eca1ece459339ea7046e05847c42cf47c2972" Oct 03 00:10:24 crc kubenswrapper[4832]: I1003 00:10:24.292528 4832 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lc7px"] Oct 03 00:10:24 crc kubenswrapper[4832]: I1003 00:10:24.296626 4832 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-lc7px"] Oct 03 00:10:25 crc kubenswrapper[4832]: I1003 00:10:25.393617 4832 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7af8592d-ccb5-47d1-9f02-ffac59f8b467" path="/var/lib/kubelet/pods/7af8592d-ccb5-47d1-9f02-ffac59f8b467/volumes" Oct 03 00:10:25 crc kubenswrapper[4832]: I1003 00:10:25.809650 4832 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-lhnfr" Oct 03 00:10:25 crc kubenswrapper[4832]: I1003 00:10:25.809707 4832 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-lhnfr" Oct 03 00:10:25 crc kubenswrapper[4832]: I1003 00:10:25.844172 4832 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-lhnfr" Oct 03 00:10:26 crc kubenswrapper[4832]: I1003 00:10:26.215465 4832 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-dv2sr" Oct 03 00:10:26 crc kubenswrapper[4832]: I1003 00:10:26.215713 4832 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-dv2sr" Oct 03 00:10:26 crc kubenswrapper[4832]: I1003 00:10:26.256825 4832 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-dv2sr" Oct 03 00:10:26 crc kubenswrapper[4832]: I1003 00:10:26.289269 4832 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-lhnfr" Oct 03 00:10:26 crc kubenswrapper[4832]: I1003 00:10:26.299213 4832 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-dv2sr" Oct 03 00:10:26 crc kubenswrapper[4832]: I1003 00:10:26.397475 4832 patch_prober.go:28] interesting pod/machine-config-daemon-xzpk4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 00:10:26 crc kubenswrapper[4832]: I1003 00:10:26.397532 4832 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" podUID="13eeb165-19f7-4e3f-a442-1c9c570d853b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 00:10:26 crc kubenswrapper[4832]: I1003 00:10:26.397574 4832 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" Oct 03 00:10:26 crc kubenswrapper[4832]: I1003 00:10:26.398010 4832 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"786805325c320e5c08542c14a9ac9790f7c2dec237737fe1aca1523d1f2d26a5"} pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 00:10:26 crc kubenswrapper[4832]: I1003 00:10:26.398110 4832 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" podUID="13eeb165-19f7-4e3f-a442-1c9c570d853b" containerName="machine-config-daemon" containerID="cri-o://786805325c320e5c08542c14a9ac9790f7c2dec237737fe1aca1523d1f2d26a5" gracePeriod=600 Oct 03 00:10:26 crc kubenswrapper[4832]: I1003 00:10:26.424195 4832 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-5wfsk" Oct 03 00:10:26 crc kubenswrapper[4832]: I1003 00:10:26.424823 4832 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-5wfsk" Oct 03 00:10:26 crc kubenswrapper[4832]: I1003 00:10:26.465576 4832 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-5wfsk" Oct 03 00:10:27 crc kubenswrapper[4832]: I1003 00:10:27.258708 4832 generic.go:334] "Generic (PLEG): container finished" podID="13eeb165-19f7-4e3f-a442-1c9c570d853b" containerID="786805325c320e5c08542c14a9ac9790f7c2dec237737fe1aca1523d1f2d26a5" exitCode=0 Oct 03 00:10:27 crc kubenswrapper[4832]: I1003 00:10:27.258815 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" event={"ID":"13eeb165-19f7-4e3f-a442-1c9c570d853b","Type":"ContainerDied","Data":"786805325c320e5c08542c14a9ac9790f7c2dec237737fe1aca1523d1f2d26a5"} Oct 03 00:10:27 crc kubenswrapper[4832]: I1003 00:10:27.259449 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" event={"ID":"13eeb165-19f7-4e3f-a442-1c9c570d853b","Type":"ContainerStarted","Data":"aac53b9fd0b5433a7a5671b4f58bd6f53d19df336e73459d1ccd662324d7d19e"} Oct 03 00:10:27 crc kubenswrapper[4832]: I1003 00:10:27.305528 4832 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-5wfsk" Oct 03 00:10:27 crc kubenswrapper[4832]: I1003 00:10:27.866345 4832 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-28hml" Oct 03 00:10:27 crc kubenswrapper[4832]: I1003 00:10:27.866419 4832 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-28hml" Oct 03 00:10:27 crc kubenswrapper[4832]: I1003 00:10:27.905941 4832 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-28hml" Oct 03 00:10:28 crc kubenswrapper[4832]: I1003 00:10:28.100978 4832 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-dv2sr"] Oct 03 00:10:28 crc kubenswrapper[4832]: I1003 00:10:28.217698 4832 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-4596m" Oct 03 00:10:28 crc kubenswrapper[4832]: I1003 00:10:28.217790 4832 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-4596m" Oct 03 00:10:28 crc kubenswrapper[4832]: I1003 00:10:28.259546 4832 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-4596m" Oct 03 00:10:28 crc kubenswrapper[4832]: I1003 00:10:28.264362 4832 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-dv2sr" podUID="13a851af-b472-4647-8093-8b0cd577b21a" containerName="registry-server" containerID="cri-o://08b1bd359494f6e779f801c60e27af828b455cdb19f62f107465789350306ed3" gracePeriod=2 Oct 03 00:10:28 crc kubenswrapper[4832]: I1003 00:10:28.307623 4832 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-28hml" Oct 03 00:10:28 crc kubenswrapper[4832]: I1003 00:10:28.309797 4832 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-4596m" Oct 03 00:10:28 crc kubenswrapper[4832]: I1003 00:10:28.699972 4832 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5wfsk"] Oct 03 00:10:29 crc kubenswrapper[4832]: I1003 00:10:29.271361 4832 generic.go:334] "Generic (PLEG): container finished" podID="13a851af-b472-4647-8093-8b0cd577b21a" containerID="08b1bd359494f6e779f801c60e27af828b455cdb19f62f107465789350306ed3" exitCode=0 Oct 03 00:10:29 crc kubenswrapper[4832]: I1003 00:10:29.272471 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dv2sr" event={"ID":"13a851af-b472-4647-8093-8b0cd577b21a","Type":"ContainerDied","Data":"08b1bd359494f6e779f801c60e27af828b455cdb19f62f107465789350306ed3"} Oct 03 00:10:29 crc kubenswrapper[4832]: I1003 00:10:29.913078 4832 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-dv2sr" Oct 03 00:10:30 crc kubenswrapper[4832]: I1003 00:10:30.065017 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f22mv\" (UniqueName: \"kubernetes.io/projected/13a851af-b472-4647-8093-8b0cd577b21a-kube-api-access-f22mv\") pod \"13a851af-b472-4647-8093-8b0cd577b21a\" (UID: \"13a851af-b472-4647-8093-8b0cd577b21a\") " Oct 03 00:10:30 crc kubenswrapper[4832]: I1003 00:10:30.065083 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13a851af-b472-4647-8093-8b0cd577b21a-catalog-content\") pod \"13a851af-b472-4647-8093-8b0cd577b21a\" (UID: \"13a851af-b472-4647-8093-8b0cd577b21a\") " Oct 03 00:10:30 crc kubenswrapper[4832]: I1003 00:10:30.065126 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13a851af-b472-4647-8093-8b0cd577b21a-utilities\") pod \"13a851af-b472-4647-8093-8b0cd577b21a\" (UID: \"13a851af-b472-4647-8093-8b0cd577b21a\") " Oct 03 00:10:30 crc kubenswrapper[4832]: I1003 00:10:30.066165 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/13a851af-b472-4647-8093-8b0cd577b21a-utilities" (OuterVolumeSpecName: "utilities") pod "13a851af-b472-4647-8093-8b0cd577b21a" (UID: "13a851af-b472-4647-8093-8b0cd577b21a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 00:10:30 crc kubenswrapper[4832]: I1003 00:10:30.073309 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13a851af-b472-4647-8093-8b0cd577b21a-kube-api-access-f22mv" (OuterVolumeSpecName: "kube-api-access-f22mv") pod "13a851af-b472-4647-8093-8b0cd577b21a" (UID: "13a851af-b472-4647-8093-8b0cd577b21a"). InnerVolumeSpecName "kube-api-access-f22mv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 00:10:30 crc kubenswrapper[4832]: I1003 00:10:30.117271 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/13a851af-b472-4647-8093-8b0cd577b21a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "13a851af-b472-4647-8093-8b0cd577b21a" (UID: "13a851af-b472-4647-8093-8b0cd577b21a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 00:10:30 crc kubenswrapper[4832]: I1003 00:10:30.166859 4832 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f22mv\" (UniqueName: \"kubernetes.io/projected/13a851af-b472-4647-8093-8b0cd577b21a-kube-api-access-f22mv\") on node \"crc\" DevicePath \"\"" Oct 03 00:10:30 crc kubenswrapper[4832]: I1003 00:10:30.166901 4832 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13a851af-b472-4647-8093-8b0cd577b21a-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 00:10:30 crc kubenswrapper[4832]: I1003 00:10:30.166911 4832 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13a851af-b472-4647-8093-8b0cd577b21a-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 00:10:30 crc kubenswrapper[4832]: I1003 00:10:30.289366 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dv2sr" event={"ID":"13a851af-b472-4647-8093-8b0cd577b21a","Type":"ContainerDied","Data":"d4304f6f0a5fdcac3fffe0eb904551091ef57cc3e61ef6f248d1400bd6e77bf0"} Oct 03 00:10:30 crc kubenswrapper[4832]: I1003 00:10:30.289410 4832 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dv2sr" Oct 03 00:10:30 crc kubenswrapper[4832]: I1003 00:10:30.289451 4832 scope.go:117] "RemoveContainer" containerID="08b1bd359494f6e779f801c60e27af828b455cdb19f62f107465789350306ed3" Oct 03 00:10:30 crc kubenswrapper[4832]: I1003 00:10:30.289534 4832 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-5wfsk" podUID="bad1920b-7616-41f9-b2f9-84b3073faede" containerName="registry-server" containerID="cri-o://f6f5477ef1a4c5e55c48a26a213c9aeb1bc020e393e0588400e069d8f6b3eaa7" gracePeriod=2 Oct 03 00:10:30 crc kubenswrapper[4832]: I1003 00:10:30.313834 4832 scope.go:117] "RemoveContainer" containerID="b896c9e938a8a43b0393a0ae5b8dd56b10cfc62f8b17b098f963774631f7f963" Oct 03 00:10:30 crc kubenswrapper[4832]: I1003 00:10:30.318074 4832 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-dv2sr"] Oct 03 00:10:30 crc kubenswrapper[4832]: I1003 00:10:30.321941 4832 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-dv2sr"] Oct 03 00:10:30 crc kubenswrapper[4832]: I1003 00:10:30.343958 4832 scope.go:117] "RemoveContainer" containerID="73ebff37ae24d4ae4ad91cfe5e6f84ee746fd7e5dcb4d80a65be7e3a960e7f58" Oct 03 00:10:30 crc kubenswrapper[4832]: I1003 00:10:30.500675 4832 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4596m"] Oct 03 00:10:30 crc kubenswrapper[4832]: I1003 00:10:30.500914 4832 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-4596m" podUID="549600c0-8e76-4369-bcf8-8d0176082edc" containerName="registry-server" containerID="cri-o://4b285bf646bef6bd2db3583072c5d6bb3beaa44ed1c380aa50a223014ef5f7f0" gracePeriod=2 Oct 03 00:10:31 crc kubenswrapper[4832]: I1003 00:10:31.393124 4832 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13a851af-b472-4647-8093-8b0cd577b21a" path="/var/lib/kubelet/pods/13a851af-b472-4647-8093-8b0cd577b21a/volumes" Oct 03 00:10:32 crc kubenswrapper[4832]: I1003 00:10:32.092788 4832 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4596m" Oct 03 00:10:32 crc kubenswrapper[4832]: I1003 00:10:32.126160 4832 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5wfsk" Oct 03 00:10:32 crc kubenswrapper[4832]: I1003 00:10:32.193855 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/549600c0-8e76-4369-bcf8-8d0176082edc-utilities\") pod \"549600c0-8e76-4369-bcf8-8d0176082edc\" (UID: \"549600c0-8e76-4369-bcf8-8d0176082edc\") " Oct 03 00:10:32 crc kubenswrapper[4832]: I1003 00:10:32.193973 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-297jq\" (UniqueName: \"kubernetes.io/projected/549600c0-8e76-4369-bcf8-8d0176082edc-kube-api-access-297jq\") pod \"549600c0-8e76-4369-bcf8-8d0176082edc\" (UID: \"549600c0-8e76-4369-bcf8-8d0176082edc\") " Oct 03 00:10:32 crc kubenswrapper[4832]: I1003 00:10:32.194002 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/549600c0-8e76-4369-bcf8-8d0176082edc-catalog-content\") pod \"549600c0-8e76-4369-bcf8-8d0176082edc\" (UID: \"549600c0-8e76-4369-bcf8-8d0176082edc\") " Oct 03 00:10:32 crc kubenswrapper[4832]: I1003 00:10:32.194918 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/549600c0-8e76-4369-bcf8-8d0176082edc-utilities" (OuterVolumeSpecName: "utilities") pod "549600c0-8e76-4369-bcf8-8d0176082edc" (UID: "549600c0-8e76-4369-bcf8-8d0176082edc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 00:10:32 crc kubenswrapper[4832]: I1003 00:10:32.200126 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/549600c0-8e76-4369-bcf8-8d0176082edc-kube-api-access-297jq" (OuterVolumeSpecName: "kube-api-access-297jq") pod "549600c0-8e76-4369-bcf8-8d0176082edc" (UID: "549600c0-8e76-4369-bcf8-8d0176082edc"). InnerVolumeSpecName "kube-api-access-297jq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 00:10:32 crc kubenswrapper[4832]: I1003 00:10:32.205630 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/549600c0-8e76-4369-bcf8-8d0176082edc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "549600c0-8e76-4369-bcf8-8d0176082edc" (UID: "549600c0-8e76-4369-bcf8-8d0176082edc"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 00:10:32 crc kubenswrapper[4832]: I1003 00:10:32.295495 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bad1920b-7616-41f9-b2f9-84b3073faede-catalog-content\") pod \"bad1920b-7616-41f9-b2f9-84b3073faede\" (UID: \"bad1920b-7616-41f9-b2f9-84b3073faede\") " Oct 03 00:10:32 crc kubenswrapper[4832]: I1003 00:10:32.295569 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2lxtj\" (UniqueName: \"kubernetes.io/projected/bad1920b-7616-41f9-b2f9-84b3073faede-kube-api-access-2lxtj\") pod \"bad1920b-7616-41f9-b2f9-84b3073faede\" (UID: \"bad1920b-7616-41f9-b2f9-84b3073faede\") " Oct 03 00:10:32 crc kubenswrapper[4832]: I1003 00:10:32.295646 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bad1920b-7616-41f9-b2f9-84b3073faede-utilities\") pod \"bad1920b-7616-41f9-b2f9-84b3073faede\" (UID: \"bad1920b-7616-41f9-b2f9-84b3073faede\") " Oct 03 00:10:32 crc kubenswrapper[4832]: I1003 00:10:32.295974 4832 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/549600c0-8e76-4369-bcf8-8d0176082edc-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 00:10:32 crc kubenswrapper[4832]: I1003 00:10:32.296003 4832 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/549600c0-8e76-4369-bcf8-8d0176082edc-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 00:10:32 crc kubenswrapper[4832]: I1003 00:10:32.296015 4832 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-297jq\" (UniqueName: \"kubernetes.io/projected/549600c0-8e76-4369-bcf8-8d0176082edc-kube-api-access-297jq\") on node \"crc\" DevicePath \"\"" Oct 03 00:10:32 crc kubenswrapper[4832]: I1003 00:10:32.296571 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bad1920b-7616-41f9-b2f9-84b3073faede-utilities" (OuterVolumeSpecName: "utilities") pod "bad1920b-7616-41f9-b2f9-84b3073faede" (UID: "bad1920b-7616-41f9-b2f9-84b3073faede"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 00:10:32 crc kubenswrapper[4832]: I1003 00:10:32.298793 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bad1920b-7616-41f9-b2f9-84b3073faede-kube-api-access-2lxtj" (OuterVolumeSpecName: "kube-api-access-2lxtj") pod "bad1920b-7616-41f9-b2f9-84b3073faede" (UID: "bad1920b-7616-41f9-b2f9-84b3073faede"). InnerVolumeSpecName "kube-api-access-2lxtj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 00:10:32 crc kubenswrapper[4832]: I1003 00:10:32.304429 4832 generic.go:334] "Generic (PLEG): container finished" podID="bad1920b-7616-41f9-b2f9-84b3073faede" containerID="f6f5477ef1a4c5e55c48a26a213c9aeb1bc020e393e0588400e069d8f6b3eaa7" exitCode=0 Oct 03 00:10:32 crc kubenswrapper[4832]: I1003 00:10:32.304474 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5wfsk" event={"ID":"bad1920b-7616-41f9-b2f9-84b3073faede","Type":"ContainerDied","Data":"f6f5477ef1a4c5e55c48a26a213c9aeb1bc020e393e0588400e069d8f6b3eaa7"} Oct 03 00:10:32 crc kubenswrapper[4832]: I1003 00:10:32.304517 4832 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5wfsk" Oct 03 00:10:32 crc kubenswrapper[4832]: I1003 00:10:32.304532 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5wfsk" event={"ID":"bad1920b-7616-41f9-b2f9-84b3073faede","Type":"ContainerDied","Data":"95ab04db99e80add450c9ea4f79641c0ce26656f6c763dfa9ff5d48f2a2c1d09"} Oct 03 00:10:32 crc kubenswrapper[4832]: I1003 00:10:32.304556 4832 scope.go:117] "RemoveContainer" containerID="f6f5477ef1a4c5e55c48a26a213c9aeb1bc020e393e0588400e069d8f6b3eaa7" Oct 03 00:10:32 crc kubenswrapper[4832]: I1003 00:10:32.311128 4832 generic.go:334] "Generic (PLEG): container finished" podID="549600c0-8e76-4369-bcf8-8d0176082edc" containerID="4b285bf646bef6bd2db3583072c5d6bb3beaa44ed1c380aa50a223014ef5f7f0" exitCode=0 Oct 03 00:10:32 crc kubenswrapper[4832]: I1003 00:10:32.311179 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4596m" event={"ID":"549600c0-8e76-4369-bcf8-8d0176082edc","Type":"ContainerDied","Data":"4b285bf646bef6bd2db3583072c5d6bb3beaa44ed1c380aa50a223014ef5f7f0"} Oct 03 00:10:32 crc kubenswrapper[4832]: I1003 00:10:32.311210 4832 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4596m" Oct 03 00:10:32 crc kubenswrapper[4832]: I1003 00:10:32.311226 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4596m" event={"ID":"549600c0-8e76-4369-bcf8-8d0176082edc","Type":"ContainerDied","Data":"bc1cf62dc334468735ea8fb70754d64b177db033bb0c48975860d51c5c720930"} Oct 03 00:10:32 crc kubenswrapper[4832]: I1003 00:10:32.326001 4832 scope.go:117] "RemoveContainer" containerID="7e95472db3cd92ba28d459a445f97687dff54f2233f33021ca6832de0293708f" Oct 03 00:10:32 crc kubenswrapper[4832]: I1003 00:10:32.335541 4832 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4596m"] Oct 03 00:10:32 crc kubenswrapper[4832]: I1003 00:10:32.339398 4832 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-4596m"] Oct 03 00:10:32 crc kubenswrapper[4832]: I1003 00:10:32.348325 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bad1920b-7616-41f9-b2f9-84b3073faede-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bad1920b-7616-41f9-b2f9-84b3073faede" (UID: "bad1920b-7616-41f9-b2f9-84b3073faede"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 00:10:32 crc kubenswrapper[4832]: I1003 00:10:32.360980 4832 scope.go:117] "RemoveContainer" containerID="bd9aa6c4c9a23d79eb81db89fe515fb9a12b0a428af76a4577d3ea0cd9f1e8e6" Oct 03 00:10:32 crc kubenswrapper[4832]: I1003 00:10:32.372635 4832 scope.go:117] "RemoveContainer" containerID="f6f5477ef1a4c5e55c48a26a213c9aeb1bc020e393e0588400e069d8f6b3eaa7" Oct 03 00:10:32 crc kubenswrapper[4832]: E1003 00:10:32.373038 4832 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6f5477ef1a4c5e55c48a26a213c9aeb1bc020e393e0588400e069d8f6b3eaa7\": container with ID starting with f6f5477ef1a4c5e55c48a26a213c9aeb1bc020e393e0588400e069d8f6b3eaa7 not found: ID does not exist" containerID="f6f5477ef1a4c5e55c48a26a213c9aeb1bc020e393e0588400e069d8f6b3eaa7" Oct 03 00:10:32 crc kubenswrapper[4832]: I1003 00:10:32.373102 4832 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6f5477ef1a4c5e55c48a26a213c9aeb1bc020e393e0588400e069d8f6b3eaa7"} err="failed to get container status \"f6f5477ef1a4c5e55c48a26a213c9aeb1bc020e393e0588400e069d8f6b3eaa7\": rpc error: code = NotFound desc = could not find container \"f6f5477ef1a4c5e55c48a26a213c9aeb1bc020e393e0588400e069d8f6b3eaa7\": container with ID starting with f6f5477ef1a4c5e55c48a26a213c9aeb1bc020e393e0588400e069d8f6b3eaa7 not found: ID does not exist" Oct 03 00:10:32 crc kubenswrapper[4832]: I1003 00:10:32.373152 4832 scope.go:117] "RemoveContainer" containerID="7e95472db3cd92ba28d459a445f97687dff54f2233f33021ca6832de0293708f" Oct 03 00:10:32 crc kubenswrapper[4832]: E1003 00:10:32.373619 4832 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e95472db3cd92ba28d459a445f97687dff54f2233f33021ca6832de0293708f\": container with ID starting with 7e95472db3cd92ba28d459a445f97687dff54f2233f33021ca6832de0293708f not found: ID does not exist" containerID="7e95472db3cd92ba28d459a445f97687dff54f2233f33021ca6832de0293708f" Oct 03 00:10:32 crc kubenswrapper[4832]: I1003 00:10:32.373662 4832 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e95472db3cd92ba28d459a445f97687dff54f2233f33021ca6832de0293708f"} err="failed to get container status \"7e95472db3cd92ba28d459a445f97687dff54f2233f33021ca6832de0293708f\": rpc error: code = NotFound desc = could not find container \"7e95472db3cd92ba28d459a445f97687dff54f2233f33021ca6832de0293708f\": container with ID starting with 7e95472db3cd92ba28d459a445f97687dff54f2233f33021ca6832de0293708f not found: ID does not exist" Oct 03 00:10:32 crc kubenswrapper[4832]: I1003 00:10:32.373696 4832 scope.go:117] "RemoveContainer" containerID="bd9aa6c4c9a23d79eb81db89fe515fb9a12b0a428af76a4577d3ea0cd9f1e8e6" Oct 03 00:10:32 crc kubenswrapper[4832]: E1003 00:10:32.374242 4832 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd9aa6c4c9a23d79eb81db89fe515fb9a12b0a428af76a4577d3ea0cd9f1e8e6\": container with ID starting with bd9aa6c4c9a23d79eb81db89fe515fb9a12b0a428af76a4577d3ea0cd9f1e8e6 not found: ID does not exist" containerID="bd9aa6c4c9a23d79eb81db89fe515fb9a12b0a428af76a4577d3ea0cd9f1e8e6" Oct 03 00:10:32 crc kubenswrapper[4832]: I1003 00:10:32.374278 4832 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"bd9aa6c4c9a23d79eb81db89fe515fb9a12b0a428af76a4577d3ea0cd9f1e8e6"} err="failed to get container status \"bd9aa6c4c9a23d79eb81db89fe515fb9a12b0a428af76a4577d3ea0cd9f1e8e6\": rpc error: code = NotFound desc = could not find container \"bd9aa6c4c9a23d79eb81db89fe515fb9a12b0a428af76a4577d3ea0cd9f1e8e6\": container with ID starting with bd9aa6c4c9a23d79eb81db89fe515fb9a12b0a428af76a4577d3ea0cd9f1e8e6 not found: ID does not exist" Oct 03 00:10:32 crc kubenswrapper[4832]: I1003 00:10:32.374300 4832 scope.go:117] "RemoveContainer" containerID="4b285bf646bef6bd2db3583072c5d6bb3beaa44ed1c380aa50a223014ef5f7f0" Oct 03 00:10:32 crc kubenswrapper[4832]: I1003 00:10:32.384479 4832 scope.go:117] "RemoveContainer" containerID="dc7ac546b1574b0b4a3dd26365f018f404781df932fb7804e0a0328e8c861fbf" Oct 03 00:10:32 crc kubenswrapper[4832]: I1003 00:10:32.397095 4832 scope.go:117] "RemoveContainer" containerID="a9ce9ab17aa2c24f0982c9482d76789089a1574da84d8c833d06a8ac7dbec566" Oct 03 00:10:32 crc kubenswrapper[4832]: I1003 00:10:32.398505 4832 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bad1920b-7616-41f9-b2f9-84b3073faede-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 00:10:32 crc kubenswrapper[4832]: I1003 00:10:32.398543 4832 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bad1920b-7616-41f9-b2f9-84b3073faede-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 00:10:32 crc kubenswrapper[4832]: I1003 00:10:32.398554 4832 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2lxtj\" (UniqueName: \"kubernetes.io/projected/bad1920b-7616-41f9-b2f9-84b3073faede-kube-api-access-2lxtj\") on node \"crc\" DevicePath \"\"" Oct 03 00:10:32 crc kubenswrapper[4832]: I1003 00:10:32.408874 4832 scope.go:117] "RemoveContainer" containerID="4b285bf646bef6bd2db3583072c5d6bb3beaa44ed1c380aa50a223014ef5f7f0" Oct 03 00:10:32 crc kubenswrapper[4832]: E1003 00:10:32.409334 4832 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b285bf646bef6bd2db3583072c5d6bb3beaa44ed1c380aa50a223014ef5f7f0\": container with ID starting with 4b285bf646bef6bd2db3583072c5d6bb3beaa44ed1c380aa50a223014ef5f7f0 not found: ID does not exist" containerID="4b285bf646bef6bd2db3583072c5d6bb3beaa44ed1c380aa50a223014ef5f7f0" Oct 03 00:10:32 crc kubenswrapper[4832]: I1003 00:10:32.409391 4832 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b285bf646bef6bd2db3583072c5d6bb3beaa44ed1c380aa50a223014ef5f7f0"} err="failed to get container status \"4b285bf646bef6bd2db3583072c5d6bb3beaa44ed1c380aa50a223014ef5f7f0\": rpc error: code = NotFound desc = could not find container \"4b285bf646bef6bd2db3583072c5d6bb3beaa44ed1c380aa50a223014ef5f7f0\": container with ID starting with 4b285bf646bef6bd2db3583072c5d6bb3beaa44ed1c380aa50a223014ef5f7f0 not found: ID does not exist" Oct 03 00:10:32 crc kubenswrapper[4832]: I1003 00:10:32.409416 4832 scope.go:117] "RemoveContainer" containerID="dc7ac546b1574b0b4a3dd26365f018f404781df932fb7804e0a0328e8c861fbf" Oct 03 00:10:32 crc kubenswrapper[4832]: E1003 00:10:32.409795 4832 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dc7ac546b1574b0b4a3dd26365f018f404781df932fb7804e0a0328e8c861fbf\": container with ID starting with 
dc7ac546b1574b0b4a3dd26365f018f404781df932fb7804e0a0328e8c861fbf not found: ID does not exist" containerID="dc7ac546b1574b0b4a3dd26365f018f404781df932fb7804e0a0328e8c861fbf" Oct 03 00:10:32 crc kubenswrapper[4832]: I1003 00:10:32.409827 4832 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc7ac546b1574b0b4a3dd26365f018f404781df932fb7804e0a0328e8c861fbf"} err="failed to get container status \"dc7ac546b1574b0b4a3dd26365f018f404781df932fb7804e0a0328e8c861fbf\": rpc error: code = NotFound desc = could not find container \"dc7ac546b1574b0b4a3dd26365f018f404781df932fb7804e0a0328e8c861fbf\": container with ID starting with dc7ac546b1574b0b4a3dd26365f018f404781df932fb7804e0a0328e8c861fbf not found: ID does not exist" Oct 03 00:10:32 crc kubenswrapper[4832]: I1003 00:10:32.409854 4832 scope.go:117] "RemoveContainer" containerID="a9ce9ab17aa2c24f0982c9482d76789089a1574da84d8c833d06a8ac7dbec566" Oct 03 00:10:32 crc kubenswrapper[4832]: E1003 00:10:32.410241 4832 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a9ce9ab17aa2c24f0982c9482d76789089a1574da84d8c833d06a8ac7dbec566\": container with ID starting with a9ce9ab17aa2c24f0982c9482d76789089a1574da84d8c833d06a8ac7dbec566 not found: ID does not exist" containerID="a9ce9ab17aa2c24f0982c9482d76789089a1574da84d8c833d06a8ac7dbec566" Oct 03 00:10:32 crc kubenswrapper[4832]: I1003 00:10:32.410277 4832 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9ce9ab17aa2c24f0982c9482d76789089a1574da84d8c833d06a8ac7dbec566"} err="failed to get container status \"a9ce9ab17aa2c24f0982c9482d76789089a1574da84d8c833d06a8ac7dbec566\": rpc error: code = NotFound desc = could not find container \"a9ce9ab17aa2c24f0982c9482d76789089a1574da84d8c833d06a8ac7dbec566\": container with ID starting with a9ce9ab17aa2c24f0982c9482d76789089a1574da84d8c833d06a8ac7dbec566 not found: ID does not exist" Oct 03 00:10:32 crc kubenswrapper[4832]: I1003 00:10:32.631600 4832 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5wfsk"] Oct 03 00:10:32 crc kubenswrapper[4832]: I1003 00:10:32.634233 4832 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-5wfsk"] Oct 03 00:10:33 crc kubenswrapper[4832]: I1003 00:10:33.393481 4832 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="549600c0-8e76-4369-bcf8-8d0176082edc" path="/var/lib/kubelet/pods/549600c0-8e76-4369-bcf8-8d0176082edc/volumes" Oct 03 00:10:33 crc kubenswrapper[4832]: I1003 00:10:33.394227 4832 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bad1920b-7616-41f9-b2f9-84b3073faede" path="/var/lib/kubelet/pods/bad1920b-7616-41f9-b2f9-84b3073faede/volumes" Oct 03 00:11:51 crc kubenswrapper[4832]: I1003 00:11:51.805201 4832 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-j9lmn"] Oct 03 00:11:51 crc kubenswrapper[4832]: I1003 00:11:51.806108 4832 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-j9lmn" podUID="93fca13a-c8e1-4239-b419-e4055be395cd" containerName="registry-server" containerID="cri-o://fe1e6b4f69f22cf628aa168c8df5abf88af1530a722fb8fda3693973d4884ae9" gracePeriod=30 Oct 03 00:11:51 crc kubenswrapper[4832]: I1003 00:11:51.811766 4832 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-lhnfr"] Oct 03 00:11:51 crc kubenswrapper[4832]: I1003 00:11:51.811999 4832 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-lhnfr" podUID="bbb2ac77-f639-41e8-897d-7744dd4e987d" containerName="registry-server" containerID="cri-o://758bd1e89a2b4fba3a0850c3659e0903fe382783b207f84d5456f817169798be" gracePeriod=30 Oct 03 00:11:51 crc kubenswrapper[4832]: I1003 00:11:51.821020 4832 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-wbjmp"] Oct 03 00:11:51 crc kubenswrapper[4832]: I1003 00:11:51.821536 4832 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-wbjmp" podUID="8ac12ff5-1646-45c2-822c-7741b913d805" containerName="marketplace-operator" containerID="cri-o://b1ce2fb08cf3bbf936ab7af9d29562d673c5017fcb208edefa3025e6c88581a6" gracePeriod=30 Oct 03 00:11:51 crc kubenswrapper[4832]: I1003 00:11:51.832329 4832 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-28hml"] Oct 03 00:11:51 crc kubenswrapper[4832]: I1003 00:11:51.832611 4832 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-28hml" podUID="99b27c40-f8ce-41a0-befc-18366ba6770b" containerName="registry-server" containerID="cri-o://27d909a22d477b03dcbbd948ad95d924e6bf452aa738d6e3935898779acf57b1" gracePeriod=30 Oct 03 00:11:51 crc kubenswrapper[4832]: I1003 00:11:51.846372 4832 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-6jnq4"] Oct 03 00:11:51 crc kubenswrapper[4832]: E1003 00:11:51.846607 4832 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bad1920b-7616-41f9-b2f9-84b3073faede" containerName="extract-utilities" Oct 03 00:11:51 crc kubenswrapper[4832]: I1003 00:11:51.846626 4832 state_mem.go:107] "Deleted CPUSet assignment" podUID="bad1920b-7616-41f9-b2f9-84b3073faede" containerName="extract-utilities" Oct 03 00:11:51 crc kubenswrapper[4832]: E1003 00:11:51.846645 4832 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="549600c0-8e76-4369-bcf8-8d0176082edc" containerName="extract-utilities" Oct 03 00:11:51 crc kubenswrapper[4832]: I1003 00:11:51.846654 4832 state_mem.go:107] "Deleted CPUSet assignment" podUID="549600c0-8e76-4369-bcf8-8d0176082edc" containerName="extract-utilities" Oct 03 00:11:51 crc kubenswrapper[4832]: E1003 00:11:51.846666 4832 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ae7d3f9-d39c-40d8-9fd5-ce2bf7d5d217" containerName="pruner" Oct 03 00:11:51 crc kubenswrapper[4832]: I1003 00:11:51.846673 4832 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ae7d3f9-d39c-40d8-9fd5-ce2bf7d5d217" containerName="pruner" Oct 03 00:11:51 crc kubenswrapper[4832]: E1003 00:11:51.846682 4832 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="549600c0-8e76-4369-bcf8-8d0176082edc" containerName="registry-server" Oct 03 00:11:51 crc kubenswrapper[4832]: I1003 00:11:51.846689 4832 state_mem.go:107] "Deleted CPUSet assignment" podUID="549600c0-8e76-4369-bcf8-8d0176082edc" containerName="registry-server" Oct 03 00:11:51 crc kubenswrapper[4832]: E1003 00:11:51.846702 4832 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13a851af-b472-4647-8093-8b0cd577b21a" containerName="registry-server" Oct 03 00:11:51 crc kubenswrapper[4832]: I1003 
00:11:51.846710 4832 state_mem.go:107] "Deleted CPUSet assignment" podUID="13a851af-b472-4647-8093-8b0cd577b21a" containerName="registry-server" Oct 03 00:11:51 crc kubenswrapper[4832]: E1003 00:11:51.846722 4832 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bad1920b-7616-41f9-b2f9-84b3073faede" containerName="extract-content" Oct 03 00:11:51 crc kubenswrapper[4832]: I1003 00:11:51.846729 4832 state_mem.go:107] "Deleted CPUSet assignment" podUID="bad1920b-7616-41f9-b2f9-84b3073faede" containerName="extract-content" Oct 03 00:11:51 crc kubenswrapper[4832]: E1003 00:11:51.846738 4832 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7af8592d-ccb5-47d1-9f02-ffac59f8b467" containerName="extract-content" Oct 03 00:11:51 crc kubenswrapper[4832]: I1003 00:11:51.846745 4832 state_mem.go:107] "Deleted CPUSet assignment" podUID="7af8592d-ccb5-47d1-9f02-ffac59f8b467" containerName="extract-content" Oct 03 00:11:51 crc kubenswrapper[4832]: E1003 00:11:51.846776 4832 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3335886b-58ec-44d4-9b6a-b7d81b4aa3fe" containerName="image-pruner" Oct 03 00:11:51 crc kubenswrapper[4832]: I1003 00:11:51.846783 4832 state_mem.go:107] "Deleted CPUSet assignment" podUID="3335886b-58ec-44d4-9b6a-b7d81b4aa3fe" containerName="image-pruner" Oct 03 00:11:51 crc kubenswrapper[4832]: E1003 00:11:51.846793 4832 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bad1920b-7616-41f9-b2f9-84b3073faede" containerName="registry-server" Oct 03 00:11:51 crc kubenswrapper[4832]: I1003 00:11:51.846799 4832 state_mem.go:107] "Deleted CPUSet assignment" podUID="bad1920b-7616-41f9-b2f9-84b3073faede" containerName="registry-server" Oct 03 00:11:51 crc kubenswrapper[4832]: E1003 00:11:51.846809 4832 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7af8592d-ccb5-47d1-9f02-ffac59f8b467" containerName="extract-utilities" Oct 03 00:11:51 crc kubenswrapper[4832]: I1003 00:11:51.846816 4832 state_mem.go:107] "Deleted CPUSet assignment" podUID="7af8592d-ccb5-47d1-9f02-ffac59f8b467" containerName="extract-utilities" Oct 03 00:11:51 crc kubenswrapper[4832]: E1003 00:11:51.846828 4832 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13a851af-b472-4647-8093-8b0cd577b21a" containerName="extract-content" Oct 03 00:11:51 crc kubenswrapper[4832]: I1003 00:11:51.846836 4832 state_mem.go:107] "Deleted CPUSet assignment" podUID="13a851af-b472-4647-8093-8b0cd577b21a" containerName="extract-content" Oct 03 00:11:51 crc kubenswrapper[4832]: E1003 00:11:51.846846 4832 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7af8592d-ccb5-47d1-9f02-ffac59f8b467" containerName="registry-server" Oct 03 00:11:51 crc kubenswrapper[4832]: I1003 00:11:51.846853 4832 state_mem.go:107] "Deleted CPUSet assignment" podUID="7af8592d-ccb5-47d1-9f02-ffac59f8b467" containerName="registry-server" Oct 03 00:11:51 crc kubenswrapper[4832]: E1003 00:11:51.846862 4832 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13a851af-b472-4647-8093-8b0cd577b21a" containerName="extract-utilities" Oct 03 00:11:51 crc kubenswrapper[4832]: I1003 00:11:51.846869 4832 state_mem.go:107] "Deleted CPUSet assignment" podUID="13a851af-b472-4647-8093-8b0cd577b21a" containerName="extract-utilities" Oct 03 00:11:51 crc kubenswrapper[4832]: E1003 00:11:51.846877 4832 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="549600c0-8e76-4369-bcf8-8d0176082edc" containerName="extract-content" Oct 03 00:11:51 crc 
kubenswrapper[4832]: I1003 00:11:51.846883 4832 state_mem.go:107] "Deleted CPUSet assignment" podUID="549600c0-8e76-4369-bcf8-8d0176082edc" containerName="extract-content" Oct 03 00:11:51 crc kubenswrapper[4832]: I1003 00:11:51.846989 4832 memory_manager.go:354] "RemoveStaleState removing state" podUID="549600c0-8e76-4369-bcf8-8d0176082edc" containerName="registry-server" Oct 03 00:11:51 crc kubenswrapper[4832]: I1003 00:11:51.847003 4832 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ae7d3f9-d39c-40d8-9fd5-ce2bf7d5d217" containerName="pruner" Oct 03 00:11:51 crc kubenswrapper[4832]: I1003 00:11:51.847014 4832 memory_manager.go:354] "RemoveStaleState removing state" podUID="7af8592d-ccb5-47d1-9f02-ffac59f8b467" containerName="registry-server" Oct 03 00:11:51 crc kubenswrapper[4832]: I1003 00:11:51.847025 4832 memory_manager.go:354] "RemoveStaleState removing state" podUID="13a851af-b472-4647-8093-8b0cd577b21a" containerName="registry-server" Oct 03 00:11:51 crc kubenswrapper[4832]: I1003 00:11:51.847035 4832 memory_manager.go:354] "RemoveStaleState removing state" podUID="bad1920b-7616-41f9-b2f9-84b3073faede" containerName="registry-server" Oct 03 00:11:51 crc kubenswrapper[4832]: I1003 00:11:51.847045 4832 memory_manager.go:354] "RemoveStaleState removing state" podUID="3335886b-58ec-44d4-9b6a-b7d81b4aa3fe" containerName="image-pruner" Oct 03 00:11:51 crc kubenswrapper[4832]: I1003 00:11:51.847471 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-6jnq4" Oct 03 00:11:51 crc kubenswrapper[4832]: I1003 00:11:51.849406 4832 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-46d2w"] Oct 03 00:11:51 crc kubenswrapper[4832]: I1003 00:11:51.849597 4832 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-46d2w" podUID="137f996e-6dba-4fce-b6f4-650e9c96f006" containerName="registry-server" containerID="cri-o://6ee1a34ad047636439552083715a9e26012d5619b3bf95b5fdfa309e942dd909" gracePeriod=30 Oct 03 00:11:51 crc kubenswrapper[4832]: I1003 00:11:51.864916 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-6jnq4"] Oct 03 00:11:51 crc kubenswrapper[4832]: I1003 00:11:51.918930 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wsk94\" (UniqueName: \"kubernetes.io/projected/547c7572-9eeb-4437-a528-2b1cc5eb573b-kube-api-access-wsk94\") pod \"marketplace-operator-79b997595-6jnq4\" (UID: \"547c7572-9eeb-4437-a528-2b1cc5eb573b\") " pod="openshift-marketplace/marketplace-operator-79b997595-6jnq4" Oct 03 00:11:51 crc kubenswrapper[4832]: I1003 00:11:51.919019 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/547c7572-9eeb-4437-a528-2b1cc5eb573b-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-6jnq4\" (UID: \"547c7572-9eeb-4437-a528-2b1cc5eb573b\") " pod="openshift-marketplace/marketplace-operator-79b997595-6jnq4" Oct 03 00:11:51 crc kubenswrapper[4832]: I1003 00:11:51.919135 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/547c7572-9eeb-4437-a528-2b1cc5eb573b-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-6jnq4\" 
(UID: \"547c7572-9eeb-4437-a528-2b1cc5eb573b\") " pod="openshift-marketplace/marketplace-operator-79b997595-6jnq4" Oct 03 00:11:52 crc kubenswrapper[4832]: I1003 00:11:52.020120 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/547c7572-9eeb-4437-a528-2b1cc5eb573b-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-6jnq4\" (UID: \"547c7572-9eeb-4437-a528-2b1cc5eb573b\") " pod="openshift-marketplace/marketplace-operator-79b997595-6jnq4" Oct 03 00:11:52 crc kubenswrapper[4832]: I1003 00:11:52.020210 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wsk94\" (UniqueName: \"kubernetes.io/projected/547c7572-9eeb-4437-a528-2b1cc5eb573b-kube-api-access-wsk94\") pod \"marketplace-operator-79b997595-6jnq4\" (UID: \"547c7572-9eeb-4437-a528-2b1cc5eb573b\") " pod="openshift-marketplace/marketplace-operator-79b997595-6jnq4" Oct 03 00:11:52 crc kubenswrapper[4832]: I1003 00:11:52.020299 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/547c7572-9eeb-4437-a528-2b1cc5eb573b-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-6jnq4\" (UID: \"547c7572-9eeb-4437-a528-2b1cc5eb573b\") " pod="openshift-marketplace/marketplace-operator-79b997595-6jnq4" Oct 03 00:11:52 crc kubenswrapper[4832]: I1003 00:11:52.021925 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/547c7572-9eeb-4437-a528-2b1cc5eb573b-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-6jnq4\" (UID: \"547c7572-9eeb-4437-a528-2b1cc5eb573b\") " pod="openshift-marketplace/marketplace-operator-79b997595-6jnq4" Oct 03 00:11:52 crc kubenswrapper[4832]: I1003 00:11:52.028137 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/547c7572-9eeb-4437-a528-2b1cc5eb573b-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-6jnq4\" (UID: \"547c7572-9eeb-4437-a528-2b1cc5eb573b\") " pod="openshift-marketplace/marketplace-operator-79b997595-6jnq4" Oct 03 00:11:52 crc kubenswrapper[4832]: I1003 00:11:52.039381 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wsk94\" (UniqueName: \"kubernetes.io/projected/547c7572-9eeb-4437-a528-2b1cc5eb573b-kube-api-access-wsk94\") pod \"marketplace-operator-79b997595-6jnq4\" (UID: \"547c7572-9eeb-4437-a528-2b1cc5eb573b\") " pod="openshift-marketplace/marketplace-operator-79b997595-6jnq4" Oct 03 00:11:52 crc kubenswrapper[4832]: I1003 00:11:52.165210 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-6jnq4" Oct 03 00:11:52 crc kubenswrapper[4832]: I1003 00:11:52.273473 4832 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-j9lmn" Oct 03 00:11:52 crc kubenswrapper[4832]: I1003 00:11:52.427862 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93fca13a-c8e1-4239-b419-e4055be395cd-utilities\") pod \"93fca13a-c8e1-4239-b419-e4055be395cd\" (UID: \"93fca13a-c8e1-4239-b419-e4055be395cd\") " Oct 03 00:11:52 crc kubenswrapper[4832]: I1003 00:11:52.427953 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g5zgb\" (UniqueName: \"kubernetes.io/projected/93fca13a-c8e1-4239-b419-e4055be395cd-kube-api-access-g5zgb\") pod \"93fca13a-c8e1-4239-b419-e4055be395cd\" (UID: \"93fca13a-c8e1-4239-b419-e4055be395cd\") " Oct 03 00:11:52 crc kubenswrapper[4832]: I1003 00:11:52.428016 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/93fca13a-c8e1-4239-b419-e4055be395cd-catalog-content\") pod \"93fca13a-c8e1-4239-b419-e4055be395cd\" (UID: \"93fca13a-c8e1-4239-b419-e4055be395cd\") " Oct 03 00:11:52 crc kubenswrapper[4832]: I1003 00:11:52.428798 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/93fca13a-c8e1-4239-b419-e4055be395cd-utilities" (OuterVolumeSpecName: "utilities") pod "93fca13a-c8e1-4239-b419-e4055be395cd" (UID: "93fca13a-c8e1-4239-b419-e4055be395cd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 00:11:52 crc kubenswrapper[4832]: I1003 00:11:52.431622 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93fca13a-c8e1-4239-b419-e4055be395cd-kube-api-access-g5zgb" (OuterVolumeSpecName: "kube-api-access-g5zgb") pod "93fca13a-c8e1-4239-b419-e4055be395cd" (UID: "93fca13a-c8e1-4239-b419-e4055be395cd"). InnerVolumeSpecName "kube-api-access-g5zgb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 00:11:52 crc kubenswrapper[4832]: I1003 00:11:52.472226 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/93fca13a-c8e1-4239-b419-e4055be395cd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "93fca13a-c8e1-4239-b419-e4055be395cd" (UID: "93fca13a-c8e1-4239-b419-e4055be395cd"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 00:11:52 crc kubenswrapper[4832]: I1003 00:11:52.534241 4832 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/93fca13a-c8e1-4239-b419-e4055be395cd-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 00:11:52 crc kubenswrapper[4832]: I1003 00:11:52.534277 4832 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93fca13a-c8e1-4239-b419-e4055be395cd-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 00:11:52 crc kubenswrapper[4832]: I1003 00:11:52.534291 4832 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g5zgb\" (UniqueName: \"kubernetes.io/projected/93fca13a-c8e1-4239-b419-e4055be395cd-kube-api-access-g5zgb\") on node \"crc\" DevicePath \"\"" Oct 03 00:11:52 crc kubenswrapper[4832]: I1003 00:11:52.540962 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-6jnq4"] Oct 03 00:11:52 crc kubenswrapper[4832]: I1003 00:11:52.686783 4832 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lhnfr" Oct 03 00:11:52 crc kubenswrapper[4832]: I1003 00:11:52.690891 4832 generic.go:334] "Generic (PLEG): container finished" podID="137f996e-6dba-4fce-b6f4-650e9c96f006" containerID="6ee1a34ad047636439552083715a9e26012d5619b3bf95b5fdfa309e942dd909" exitCode=0 Oct 03 00:11:52 crc kubenswrapper[4832]: I1003 00:11:52.690964 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-46d2w" event={"ID":"137f996e-6dba-4fce-b6f4-650e9c96f006","Type":"ContainerDied","Data":"6ee1a34ad047636439552083715a9e26012d5619b3bf95b5fdfa309e942dd909"} Oct 03 00:11:52 crc kubenswrapper[4832]: I1003 00:11:52.692660 4832 generic.go:334] "Generic (PLEG): container finished" podID="99b27c40-f8ce-41a0-befc-18366ba6770b" containerID="27d909a22d477b03dcbbd948ad95d924e6bf452aa738d6e3935898779acf57b1" exitCode=0 Oct 03 00:11:52 crc kubenswrapper[4832]: I1003 00:11:52.692719 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-28hml" event={"ID":"99b27c40-f8ce-41a0-befc-18366ba6770b","Type":"ContainerDied","Data":"27d909a22d477b03dcbbd948ad95d924e6bf452aa738d6e3935898779acf57b1"} Oct 03 00:11:52 crc kubenswrapper[4832]: I1003 00:11:52.694394 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-6jnq4" event={"ID":"547c7572-9eeb-4437-a528-2b1cc5eb573b","Type":"ContainerStarted","Data":"a109d240a634915940ea04ee9b3e8bbb6647e3e62f8252e56ef5ad5c735268f7"} Oct 03 00:11:52 crc kubenswrapper[4832]: I1003 00:11:52.696292 4832 generic.go:334] "Generic (PLEG): container finished" podID="bbb2ac77-f639-41e8-897d-7744dd4e987d" containerID="758bd1e89a2b4fba3a0850c3659e0903fe382783b207f84d5456f817169798be" exitCode=0 Oct 03 00:11:52 crc kubenswrapper[4832]: I1003 00:11:52.696376 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lhnfr" event={"ID":"bbb2ac77-f639-41e8-897d-7744dd4e987d","Type":"ContainerDied","Data":"758bd1e89a2b4fba3a0850c3659e0903fe382783b207f84d5456f817169798be"} Oct 03 00:11:52 crc kubenswrapper[4832]: I1003 00:11:52.696399 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lhnfr" 
event={"ID":"bbb2ac77-f639-41e8-897d-7744dd4e987d","Type":"ContainerDied","Data":"a160fc41a210d7d74d7ca391fdc9fbca2dd10509759cfcb83ca6e55ab68242a0"} Oct 03 00:11:52 crc kubenswrapper[4832]: I1003 00:11:52.696459 4832 scope.go:117] "RemoveContainer" containerID="758bd1e89a2b4fba3a0850c3659e0903fe382783b207f84d5456f817169798be" Oct 03 00:11:52 crc kubenswrapper[4832]: I1003 00:11:52.696642 4832 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lhnfr" Oct 03 00:11:52 crc kubenswrapper[4832]: I1003 00:11:52.700367 4832 generic.go:334] "Generic (PLEG): container finished" podID="8ac12ff5-1646-45c2-822c-7741b913d805" containerID="b1ce2fb08cf3bbf936ab7af9d29562d673c5017fcb208edefa3025e6c88581a6" exitCode=0 Oct 03 00:11:52 crc kubenswrapper[4832]: I1003 00:11:52.700433 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-wbjmp" event={"ID":"8ac12ff5-1646-45c2-822c-7741b913d805","Type":"ContainerDied","Data":"b1ce2fb08cf3bbf936ab7af9d29562d673c5017fcb208edefa3025e6c88581a6"} Oct 03 00:11:52 crc kubenswrapper[4832]: I1003 00:11:52.705542 4832 generic.go:334] "Generic (PLEG): container finished" podID="93fca13a-c8e1-4239-b419-e4055be395cd" containerID="fe1e6b4f69f22cf628aa168c8df5abf88af1530a722fb8fda3693973d4884ae9" exitCode=0 Oct 03 00:11:52 crc kubenswrapper[4832]: I1003 00:11:52.705589 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j9lmn" event={"ID":"93fca13a-c8e1-4239-b419-e4055be395cd","Type":"ContainerDied","Data":"fe1e6b4f69f22cf628aa168c8df5abf88af1530a722fb8fda3693973d4884ae9"} Oct 03 00:11:52 crc kubenswrapper[4832]: I1003 00:11:52.705616 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j9lmn" event={"ID":"93fca13a-c8e1-4239-b419-e4055be395cd","Type":"ContainerDied","Data":"a2b84add0e4e942439bbc953a70702f5a9418a464e6b5ff604633054e4887e74"} Oct 03 00:11:52 crc kubenswrapper[4832]: I1003 00:11:52.705684 4832 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-j9lmn" Oct 03 00:11:52 crc kubenswrapper[4832]: I1003 00:11:52.726669 4832 scope.go:117] "RemoveContainer" containerID="6491ccdc6a1d5004158059ca509f682a55b04b2a4c710fd26ee4359f594d051e" Oct 03 00:11:52 crc kubenswrapper[4832]: I1003 00:11:52.745001 4832 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-j9lmn"] Oct 03 00:11:52 crc kubenswrapper[4832]: I1003 00:11:52.748633 4832 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-j9lmn"] Oct 03 00:11:52 crc kubenswrapper[4832]: I1003 00:11:52.761959 4832 scope.go:117] "RemoveContainer" containerID="47b5929020b3c6cd6c804a876b927119cd721463907674eb95c730b8394edae4" Oct 03 00:11:52 crc kubenswrapper[4832]: I1003 00:11:52.763625 4832 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-46d2w" Oct 03 00:11:52 crc kubenswrapper[4832]: I1003 00:11:52.770192 4832 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-28hml" Oct 03 00:11:52 crc kubenswrapper[4832]: I1003 00:11:52.777162 4832 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-wbjmp" Oct 03 00:11:52 crc kubenswrapper[4832]: I1003 00:11:52.777426 4832 scope.go:117] "RemoveContainer" containerID="758bd1e89a2b4fba3a0850c3659e0903fe382783b207f84d5456f817169798be" Oct 03 00:11:52 crc kubenswrapper[4832]: E1003 00:11:52.778021 4832 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"758bd1e89a2b4fba3a0850c3659e0903fe382783b207f84d5456f817169798be\": container with ID starting with 758bd1e89a2b4fba3a0850c3659e0903fe382783b207f84d5456f817169798be not found: ID does not exist" containerID="758bd1e89a2b4fba3a0850c3659e0903fe382783b207f84d5456f817169798be" Oct 03 00:11:52 crc kubenswrapper[4832]: I1003 00:11:52.778064 4832 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"758bd1e89a2b4fba3a0850c3659e0903fe382783b207f84d5456f817169798be"} err="failed to get container status \"758bd1e89a2b4fba3a0850c3659e0903fe382783b207f84d5456f817169798be\": rpc error: code = NotFound desc = could not find container \"758bd1e89a2b4fba3a0850c3659e0903fe382783b207f84d5456f817169798be\": container with ID starting with 758bd1e89a2b4fba3a0850c3659e0903fe382783b207f84d5456f817169798be not found: ID does not exist" Oct 03 00:11:52 crc kubenswrapper[4832]: I1003 00:11:52.778093 4832 scope.go:117] "RemoveContainer" containerID="6491ccdc6a1d5004158059ca509f682a55b04b2a4c710fd26ee4359f594d051e" Oct 03 00:11:52 crc kubenswrapper[4832]: E1003 00:11:52.778349 4832 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6491ccdc6a1d5004158059ca509f682a55b04b2a4c710fd26ee4359f594d051e\": container with ID starting with 6491ccdc6a1d5004158059ca509f682a55b04b2a4c710fd26ee4359f594d051e not found: ID does not exist" containerID="6491ccdc6a1d5004158059ca509f682a55b04b2a4c710fd26ee4359f594d051e" Oct 03 00:11:52 crc kubenswrapper[4832]: I1003 00:11:52.778385 4832 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6491ccdc6a1d5004158059ca509f682a55b04b2a4c710fd26ee4359f594d051e"} err="failed to get container status \"6491ccdc6a1d5004158059ca509f682a55b04b2a4c710fd26ee4359f594d051e\": rpc error: code = NotFound desc = could not find container \"6491ccdc6a1d5004158059ca509f682a55b04b2a4c710fd26ee4359f594d051e\": container with ID starting with 6491ccdc6a1d5004158059ca509f682a55b04b2a4c710fd26ee4359f594d051e not found: ID does not exist" Oct 03 00:11:52 crc kubenswrapper[4832]: I1003 00:11:52.778412 4832 scope.go:117] "RemoveContainer" containerID="47b5929020b3c6cd6c804a876b927119cd721463907674eb95c730b8394edae4" Oct 03 00:11:52 crc kubenswrapper[4832]: E1003 00:11:52.778640 4832 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"47b5929020b3c6cd6c804a876b927119cd721463907674eb95c730b8394edae4\": container with ID starting with 47b5929020b3c6cd6c804a876b927119cd721463907674eb95c730b8394edae4 not found: ID does not exist" containerID="47b5929020b3c6cd6c804a876b927119cd721463907674eb95c730b8394edae4" Oct 03 00:11:52 crc kubenswrapper[4832]: I1003 00:11:52.778666 4832 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47b5929020b3c6cd6c804a876b927119cd721463907674eb95c730b8394edae4"} err="failed to get container status \"47b5929020b3c6cd6c804a876b927119cd721463907674eb95c730b8394edae4\": rpc 
error: code = NotFound desc = could not find container \"47b5929020b3c6cd6c804a876b927119cd721463907674eb95c730b8394edae4\": container with ID starting with 47b5929020b3c6cd6c804a876b927119cd721463907674eb95c730b8394edae4 not found: ID does not exist" Oct 03 00:11:52 crc kubenswrapper[4832]: I1003 00:11:52.778682 4832 scope.go:117] "RemoveContainer" containerID="fe1e6b4f69f22cf628aa168c8df5abf88af1530a722fb8fda3693973d4884ae9" Oct 03 00:11:52 crc kubenswrapper[4832]: I1003 00:11:52.807306 4832 scope.go:117] "RemoveContainer" containerID="8ff0a793c1a08512d57d2c179e5630317475e44c1a35cfc868281716e8bdc431" Oct 03 00:11:52 crc kubenswrapper[4832]: I1003 00:11:52.829850 4832 scope.go:117] "RemoveContainer" containerID="2db2d077d2069306a71e8a5030ac7c18796f9b014aec88318638d2e3ed6931ae" Oct 03 00:11:52 crc kubenswrapper[4832]: I1003 00:11:52.839184 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bbb2ac77-f639-41e8-897d-7744dd4e987d-catalog-content\") pod \"bbb2ac77-f639-41e8-897d-7744dd4e987d\" (UID: \"bbb2ac77-f639-41e8-897d-7744dd4e987d\") " Oct 03 00:11:52 crc kubenswrapper[4832]: I1003 00:11:52.839239 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xd2dt\" (UniqueName: \"kubernetes.io/projected/bbb2ac77-f639-41e8-897d-7744dd4e987d-kube-api-access-xd2dt\") pod \"bbb2ac77-f639-41e8-897d-7744dd4e987d\" (UID: \"bbb2ac77-f639-41e8-897d-7744dd4e987d\") " Oct 03 00:11:52 crc kubenswrapper[4832]: I1003 00:11:52.841272 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bbb2ac77-f639-41e8-897d-7744dd4e987d-utilities\") pod \"bbb2ac77-f639-41e8-897d-7744dd4e987d\" (UID: \"bbb2ac77-f639-41e8-897d-7744dd4e987d\") " Oct 03 00:11:52 crc kubenswrapper[4832]: I1003 00:11:52.843340 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bbb2ac77-f639-41e8-897d-7744dd4e987d-kube-api-access-xd2dt" (OuterVolumeSpecName: "kube-api-access-xd2dt") pod "bbb2ac77-f639-41e8-897d-7744dd4e987d" (UID: "bbb2ac77-f639-41e8-897d-7744dd4e987d"). InnerVolumeSpecName "kube-api-access-xd2dt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 00:11:52 crc kubenswrapper[4832]: I1003 00:11:52.843664 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bbb2ac77-f639-41e8-897d-7744dd4e987d-utilities" (OuterVolumeSpecName: "utilities") pod "bbb2ac77-f639-41e8-897d-7744dd4e987d" (UID: "bbb2ac77-f639-41e8-897d-7744dd4e987d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 00:11:52 crc kubenswrapper[4832]: I1003 00:11:52.854134 4832 scope.go:117] "RemoveContainer" containerID="fe1e6b4f69f22cf628aa168c8df5abf88af1530a722fb8fda3693973d4884ae9" Oct 03 00:11:52 crc kubenswrapper[4832]: E1003 00:11:52.854995 4832 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe1e6b4f69f22cf628aa168c8df5abf88af1530a722fb8fda3693973d4884ae9\": container with ID starting with fe1e6b4f69f22cf628aa168c8df5abf88af1530a722fb8fda3693973d4884ae9 not found: ID does not exist" containerID="fe1e6b4f69f22cf628aa168c8df5abf88af1530a722fb8fda3693973d4884ae9" Oct 03 00:11:52 crc kubenswrapper[4832]: I1003 00:11:52.855064 4832 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe1e6b4f69f22cf628aa168c8df5abf88af1530a722fb8fda3693973d4884ae9"} err="failed to get container status \"fe1e6b4f69f22cf628aa168c8df5abf88af1530a722fb8fda3693973d4884ae9\": rpc error: code = NotFound desc = could not find container \"fe1e6b4f69f22cf628aa168c8df5abf88af1530a722fb8fda3693973d4884ae9\": container with ID starting with fe1e6b4f69f22cf628aa168c8df5abf88af1530a722fb8fda3693973d4884ae9 not found: ID does not exist" Oct 03 00:11:52 crc kubenswrapper[4832]: I1003 00:11:52.855086 4832 scope.go:117] "RemoveContainer" containerID="8ff0a793c1a08512d57d2c179e5630317475e44c1a35cfc868281716e8bdc431" Oct 03 00:11:52 crc kubenswrapper[4832]: E1003 00:11:52.855403 4832 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ff0a793c1a08512d57d2c179e5630317475e44c1a35cfc868281716e8bdc431\": container with ID starting with 8ff0a793c1a08512d57d2c179e5630317475e44c1a35cfc868281716e8bdc431 not found: ID does not exist" containerID="8ff0a793c1a08512d57d2c179e5630317475e44c1a35cfc868281716e8bdc431" Oct 03 00:11:52 crc kubenswrapper[4832]: I1003 00:11:52.855439 4832 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ff0a793c1a08512d57d2c179e5630317475e44c1a35cfc868281716e8bdc431"} err="failed to get container status \"8ff0a793c1a08512d57d2c179e5630317475e44c1a35cfc868281716e8bdc431\": rpc error: code = NotFound desc = could not find container \"8ff0a793c1a08512d57d2c179e5630317475e44c1a35cfc868281716e8bdc431\": container with ID starting with 8ff0a793c1a08512d57d2c179e5630317475e44c1a35cfc868281716e8bdc431 not found: ID does not exist" Oct 03 00:11:52 crc kubenswrapper[4832]: I1003 00:11:52.855464 4832 scope.go:117] "RemoveContainer" containerID="2db2d077d2069306a71e8a5030ac7c18796f9b014aec88318638d2e3ed6931ae" Oct 03 00:11:52 crc kubenswrapper[4832]: E1003 00:11:52.855823 4832 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2db2d077d2069306a71e8a5030ac7c18796f9b014aec88318638d2e3ed6931ae\": container with ID starting with 2db2d077d2069306a71e8a5030ac7c18796f9b014aec88318638d2e3ed6931ae not found: ID does not exist" containerID="2db2d077d2069306a71e8a5030ac7c18796f9b014aec88318638d2e3ed6931ae" Oct 03 00:11:52 crc kubenswrapper[4832]: I1003 00:11:52.855862 4832 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2db2d077d2069306a71e8a5030ac7c18796f9b014aec88318638d2e3ed6931ae"} err="failed to get container status \"2db2d077d2069306a71e8a5030ac7c18796f9b014aec88318638d2e3ed6931ae\": rpc error: code = NotFound desc = could not 
find container \"2db2d077d2069306a71e8a5030ac7c18796f9b014aec88318638d2e3ed6931ae\": container with ID starting with 2db2d077d2069306a71e8a5030ac7c18796f9b014aec88318638d2e3ed6931ae not found: ID does not exist" Oct 03 00:11:52 crc kubenswrapper[4832]: I1003 00:11:52.909851 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bbb2ac77-f639-41e8-897d-7744dd4e987d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bbb2ac77-f639-41e8-897d-7744dd4e987d" (UID: "bbb2ac77-f639-41e8-897d-7744dd4e987d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 00:11:52 crc kubenswrapper[4832]: I1003 00:11:52.942896 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vfzpx\" (UniqueName: \"kubernetes.io/projected/99b27c40-f8ce-41a0-befc-18366ba6770b-kube-api-access-vfzpx\") pod \"99b27c40-f8ce-41a0-befc-18366ba6770b\" (UID: \"99b27c40-f8ce-41a0-befc-18366ba6770b\") " Oct 03 00:11:52 crc kubenswrapper[4832]: I1003 00:11:52.942951 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/99b27c40-f8ce-41a0-befc-18366ba6770b-utilities\") pod \"99b27c40-f8ce-41a0-befc-18366ba6770b\" (UID: \"99b27c40-f8ce-41a0-befc-18366ba6770b\") " Oct 03 00:11:52 crc kubenswrapper[4832]: I1003 00:11:52.942986 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/8ac12ff5-1646-45c2-822c-7741b913d805-marketplace-operator-metrics\") pod \"8ac12ff5-1646-45c2-822c-7741b913d805\" (UID: \"8ac12ff5-1646-45c2-822c-7741b913d805\") " Oct 03 00:11:52 crc kubenswrapper[4832]: I1003 00:11:52.943001 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/99b27c40-f8ce-41a0-befc-18366ba6770b-catalog-content\") pod \"99b27c40-f8ce-41a0-befc-18366ba6770b\" (UID: \"99b27c40-f8ce-41a0-befc-18366ba6770b\") " Oct 03 00:11:52 crc kubenswrapper[4832]: I1003 00:11:52.943037 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kkffr\" (UniqueName: \"kubernetes.io/projected/137f996e-6dba-4fce-b6f4-650e9c96f006-kube-api-access-kkffr\") pod \"137f996e-6dba-4fce-b6f4-650e9c96f006\" (UID: \"137f996e-6dba-4fce-b6f4-650e9c96f006\") " Oct 03 00:11:52 crc kubenswrapper[4832]: I1003 00:11:52.943067 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7j7nj\" (UniqueName: \"kubernetes.io/projected/8ac12ff5-1646-45c2-822c-7741b913d805-kube-api-access-7j7nj\") pod \"8ac12ff5-1646-45c2-822c-7741b913d805\" (UID: \"8ac12ff5-1646-45c2-822c-7741b913d805\") " Oct 03 00:11:52 crc kubenswrapper[4832]: I1003 00:11:52.943105 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8ac12ff5-1646-45c2-822c-7741b913d805-marketplace-trusted-ca\") pod \"8ac12ff5-1646-45c2-822c-7741b913d805\" (UID: \"8ac12ff5-1646-45c2-822c-7741b913d805\") " Oct 03 00:11:52 crc kubenswrapper[4832]: I1003 00:11:52.943127 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/137f996e-6dba-4fce-b6f4-650e9c96f006-catalog-content\") pod \"137f996e-6dba-4fce-b6f4-650e9c96f006\" (UID: 
\"137f996e-6dba-4fce-b6f4-650e9c96f006\") " Oct 03 00:11:52 crc kubenswrapper[4832]: I1003 00:11:52.943148 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/137f996e-6dba-4fce-b6f4-650e9c96f006-utilities\") pod \"137f996e-6dba-4fce-b6f4-650e9c96f006\" (UID: \"137f996e-6dba-4fce-b6f4-650e9c96f006\") " Oct 03 00:11:52 crc kubenswrapper[4832]: I1003 00:11:52.943342 4832 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bbb2ac77-f639-41e8-897d-7744dd4e987d-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 00:11:52 crc kubenswrapper[4832]: I1003 00:11:52.943357 4832 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bbb2ac77-f639-41e8-897d-7744dd4e987d-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 00:11:52 crc kubenswrapper[4832]: I1003 00:11:52.943367 4832 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xd2dt\" (UniqueName: \"kubernetes.io/projected/bbb2ac77-f639-41e8-897d-7744dd4e987d-kube-api-access-xd2dt\") on node \"crc\" DevicePath \"\"" Oct 03 00:11:52 crc kubenswrapper[4832]: I1003 00:11:52.944059 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/137f996e-6dba-4fce-b6f4-650e9c96f006-utilities" (OuterVolumeSpecName: "utilities") pod "137f996e-6dba-4fce-b6f4-650e9c96f006" (UID: "137f996e-6dba-4fce-b6f4-650e9c96f006"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 00:11:52 crc kubenswrapper[4832]: I1003 00:11:52.944333 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/99b27c40-f8ce-41a0-befc-18366ba6770b-utilities" (OuterVolumeSpecName: "utilities") pod "99b27c40-f8ce-41a0-befc-18366ba6770b" (UID: "99b27c40-f8ce-41a0-befc-18366ba6770b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 00:11:52 crc kubenswrapper[4832]: I1003 00:11:52.944490 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8ac12ff5-1646-45c2-822c-7741b913d805-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "8ac12ff5-1646-45c2-822c-7741b913d805" (UID: "8ac12ff5-1646-45c2-822c-7741b913d805"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 00:11:52 crc kubenswrapper[4832]: I1003 00:11:52.946347 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ac12ff5-1646-45c2-822c-7741b913d805-kube-api-access-7j7nj" (OuterVolumeSpecName: "kube-api-access-7j7nj") pod "8ac12ff5-1646-45c2-822c-7741b913d805" (UID: "8ac12ff5-1646-45c2-822c-7741b913d805"). InnerVolumeSpecName "kube-api-access-7j7nj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 00:11:52 crc kubenswrapper[4832]: I1003 00:11:52.947110 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99b27c40-f8ce-41a0-befc-18366ba6770b-kube-api-access-vfzpx" (OuterVolumeSpecName: "kube-api-access-vfzpx") pod "99b27c40-f8ce-41a0-befc-18366ba6770b" (UID: "99b27c40-f8ce-41a0-befc-18366ba6770b"). InnerVolumeSpecName "kube-api-access-vfzpx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 00:11:52 crc kubenswrapper[4832]: I1003 00:11:52.947490 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/137f996e-6dba-4fce-b6f4-650e9c96f006-kube-api-access-kkffr" (OuterVolumeSpecName: "kube-api-access-kkffr") pod "137f996e-6dba-4fce-b6f4-650e9c96f006" (UID: "137f996e-6dba-4fce-b6f4-650e9c96f006"). InnerVolumeSpecName "kube-api-access-kkffr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 00:11:52 crc kubenswrapper[4832]: I1003 00:11:52.954640 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ac12ff5-1646-45c2-822c-7741b913d805-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "8ac12ff5-1646-45c2-822c-7741b913d805" (UID: "8ac12ff5-1646-45c2-822c-7741b913d805"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 00:11:52 crc kubenswrapper[4832]: I1003 00:11:52.963643 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/99b27c40-f8ce-41a0-befc-18366ba6770b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "99b27c40-f8ce-41a0-befc-18366ba6770b" (UID: "99b27c40-f8ce-41a0-befc-18366ba6770b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 00:11:53 crc kubenswrapper[4832]: I1003 00:11:53.020980 4832 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lhnfr"] Oct 03 00:11:53 crc kubenswrapper[4832]: I1003 00:11:53.023762 4832 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-lhnfr"] Oct 03 00:11:53 crc kubenswrapper[4832]: I1003 00:11:53.034035 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/137f996e-6dba-4fce-b6f4-650e9c96f006-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "137f996e-6dba-4fce-b6f4-650e9c96f006" (UID: "137f996e-6dba-4fce-b6f4-650e9c96f006"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 00:11:53 crc kubenswrapper[4832]: I1003 00:11:53.044980 4832 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/99b27c40-f8ce-41a0-befc-18366ba6770b-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 00:11:53 crc kubenswrapper[4832]: I1003 00:11:53.045033 4832 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/8ac12ff5-1646-45c2-822c-7741b913d805-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Oct 03 00:11:53 crc kubenswrapper[4832]: I1003 00:11:53.045043 4832 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/99b27c40-f8ce-41a0-befc-18366ba6770b-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 00:11:53 crc kubenswrapper[4832]: I1003 00:11:53.045055 4832 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kkffr\" (UniqueName: \"kubernetes.io/projected/137f996e-6dba-4fce-b6f4-650e9c96f006-kube-api-access-kkffr\") on node \"crc\" DevicePath \"\"" Oct 03 00:11:53 crc kubenswrapper[4832]: I1003 00:11:53.045064 4832 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7j7nj\" (UniqueName: \"kubernetes.io/projected/8ac12ff5-1646-45c2-822c-7741b913d805-kube-api-access-7j7nj\") on node \"crc\" DevicePath \"\"" Oct 03 00:11:53 crc kubenswrapper[4832]: I1003 00:11:53.045073 4832 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8ac12ff5-1646-45c2-822c-7741b913d805-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 03 00:11:53 crc kubenswrapper[4832]: I1003 00:11:53.045081 4832 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/137f996e-6dba-4fce-b6f4-650e9c96f006-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 00:11:53 crc kubenswrapper[4832]: I1003 00:11:53.045089 4832 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/137f996e-6dba-4fce-b6f4-650e9c96f006-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 00:11:53 crc kubenswrapper[4832]: I1003 00:11:53.045098 4832 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vfzpx\" (UniqueName: \"kubernetes.io/projected/99b27c40-f8ce-41a0-befc-18366ba6770b-kube-api-access-vfzpx\") on node \"crc\" DevicePath \"\"" Oct 03 00:11:53 crc kubenswrapper[4832]: I1003 00:11:53.267491 4832 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-sl4zz"] Oct 03 00:11:53 crc kubenswrapper[4832]: I1003 00:11:53.392920 4832 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93fca13a-c8e1-4239-b419-e4055be395cd" path="/var/lib/kubelet/pods/93fca13a-c8e1-4239-b419-e4055be395cd/volumes" Oct 03 00:11:53 crc kubenswrapper[4832]: I1003 00:11:53.393778 4832 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bbb2ac77-f639-41e8-897d-7744dd4e987d" path="/var/lib/kubelet/pods/bbb2ac77-f639-41e8-897d-7744dd4e987d/volumes" Oct 03 00:11:53 crc kubenswrapper[4832]: I1003 00:11:53.711573 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-6jnq4" event={"ID":"547c7572-9eeb-4437-a528-2b1cc5eb573b","Type":"ContainerStarted","Data":"5e70fc6b2ad8a8a321c8c94f996b25994c9a10122d339107c709b853faff5227"} 
Oct 03 00:11:53 crc kubenswrapper[4832]: I1003 00:11:53.711805 4832 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-6jnq4" Oct 03 00:11:53 crc kubenswrapper[4832]: I1003 00:11:53.713735 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-wbjmp" event={"ID":"8ac12ff5-1646-45c2-822c-7741b913d805","Type":"ContainerDied","Data":"5591ff63650c4cc3b7aeaaf29ff049fa6f3bdf5ee448704fdcb4d93b5353cc68"} Oct 03 00:11:53 crc kubenswrapper[4832]: I1003 00:11:53.713809 4832 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-wbjmp" Oct 03 00:11:53 crc kubenswrapper[4832]: I1003 00:11:53.713828 4832 scope.go:117] "RemoveContainer" containerID="b1ce2fb08cf3bbf936ab7af9d29562d673c5017fcb208edefa3025e6c88581a6" Oct 03 00:11:53 crc kubenswrapper[4832]: I1003 00:11:53.717386 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-46d2w" event={"ID":"137f996e-6dba-4fce-b6f4-650e9c96f006","Type":"ContainerDied","Data":"e2914fcd2e30098d364fe276059547f95e7f11f836e8e4601c87291e647c14f6"} Oct 03 00:11:53 crc kubenswrapper[4832]: I1003 00:11:53.717456 4832 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-46d2w" Oct 03 00:11:53 crc kubenswrapper[4832]: I1003 00:11:53.718424 4832 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-6jnq4" Oct 03 00:11:53 crc kubenswrapper[4832]: I1003 00:11:53.719922 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-28hml" event={"ID":"99b27c40-f8ce-41a0-befc-18366ba6770b","Type":"ContainerDied","Data":"f4ba2008b67e2cf4884380503313b95611c76fb8c023e1001173efbb3021d777"} Oct 03 00:11:53 crc kubenswrapper[4832]: I1003 00:11:53.720013 4832 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-28hml" Oct 03 00:11:53 crc kubenswrapper[4832]: I1003 00:11:53.728122 4832 scope.go:117] "RemoveContainer" containerID="6ee1a34ad047636439552083715a9e26012d5619b3bf95b5fdfa309e942dd909" Oct 03 00:11:53 crc kubenswrapper[4832]: I1003 00:11:53.730244 4832 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-6jnq4" podStartSLOduration=2.730210285 podStartE2EDuration="2.730210285s" podCreationTimestamp="2025-10-03 00:11:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 00:11:53.72740219 +0000 UTC m=+306.989557296" watchObservedRunningTime="2025-10-03 00:11:53.730210285 +0000 UTC m=+306.992365381" Oct 03 00:11:53 crc kubenswrapper[4832]: I1003 00:11:53.740045 4832 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-46d2w"] Oct 03 00:11:53 crc kubenswrapper[4832]: I1003 00:11:53.743626 4832 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-46d2w"] Oct 03 00:11:53 crc kubenswrapper[4832]: I1003 00:11:53.750664 4832 scope.go:117] "RemoveContainer" containerID="7087abea0470ec63e9b19c89006ea2d6c130ca8dc732a366288bc8a233e33b96" Oct 03 00:11:53 crc kubenswrapper[4832]: I1003 00:11:53.752439 4832 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-wbjmp"] Oct 03 00:11:53 crc kubenswrapper[4832]: I1003 00:11:53.756437 4832 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-wbjmp"] Oct 03 00:11:53 crc kubenswrapper[4832]: I1003 00:11:53.767928 4832 scope.go:117] "RemoveContainer" containerID="95e4e53f36635761f69260037c83ff451793827f13a28db826b99a81c2884cb9" Oct 03 00:11:53 crc kubenswrapper[4832]: I1003 00:11:53.781817 4832 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-28hml"] Oct 03 00:11:53 crc kubenswrapper[4832]: I1003 00:11:53.785669 4832 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-28hml"] Oct 03 00:11:53 crc kubenswrapper[4832]: I1003 00:11:53.796670 4832 scope.go:117] "RemoveContainer" containerID="27d909a22d477b03dcbbd948ad95d924e6bf452aa738d6e3935898779acf57b1" Oct 03 00:11:53 crc kubenswrapper[4832]: I1003 00:11:53.813953 4832 scope.go:117] "RemoveContainer" containerID="88d8364072fbe04131882db71556d56e42b7e320648f0032fe1636e23ba80493" Oct 03 00:11:53 crc kubenswrapper[4832]: I1003 00:11:53.826286 4832 scope.go:117] "RemoveContainer" containerID="18c8dd743b8f3e796a694e45bc47f03a1f6f5d10bd51898422bf931113c59fde" Oct 03 00:11:54 crc kubenswrapper[4832]: I1003 00:11:54.226089 4832 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-gk9cm"] Oct 03 00:11:54 crc kubenswrapper[4832]: E1003 00:11:54.226340 4832 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="137f996e-6dba-4fce-b6f4-650e9c96f006" containerName="registry-server" Oct 03 00:11:54 crc kubenswrapper[4832]: I1003 00:11:54.226360 4832 state_mem.go:107] "Deleted CPUSet assignment" podUID="137f996e-6dba-4fce-b6f4-650e9c96f006" containerName="registry-server" Oct 03 00:11:54 crc kubenswrapper[4832]: E1003 00:11:54.226375 4832 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="137f996e-6dba-4fce-b6f4-650e9c96f006" 
containerName="extract-content" Oct 03 00:11:54 crc kubenswrapper[4832]: I1003 00:11:54.226382 4832 state_mem.go:107] "Deleted CPUSet assignment" podUID="137f996e-6dba-4fce-b6f4-650e9c96f006" containerName="extract-content" Oct 03 00:11:54 crc kubenswrapper[4832]: E1003 00:11:54.226394 4832 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93fca13a-c8e1-4239-b419-e4055be395cd" containerName="extract-utilities" Oct 03 00:11:54 crc kubenswrapper[4832]: I1003 00:11:54.226403 4832 state_mem.go:107] "Deleted CPUSet assignment" podUID="93fca13a-c8e1-4239-b419-e4055be395cd" containerName="extract-utilities" Oct 03 00:11:54 crc kubenswrapper[4832]: E1003 00:11:54.226442 4832 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99b27c40-f8ce-41a0-befc-18366ba6770b" containerName="registry-server" Oct 03 00:11:54 crc kubenswrapper[4832]: I1003 00:11:54.226449 4832 state_mem.go:107] "Deleted CPUSet assignment" podUID="99b27c40-f8ce-41a0-befc-18366ba6770b" containerName="registry-server" Oct 03 00:11:54 crc kubenswrapper[4832]: E1003 00:11:54.226457 4832 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ac12ff5-1646-45c2-822c-7741b913d805" containerName="marketplace-operator" Oct 03 00:11:54 crc kubenswrapper[4832]: I1003 00:11:54.226465 4832 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ac12ff5-1646-45c2-822c-7741b913d805" containerName="marketplace-operator" Oct 03 00:11:54 crc kubenswrapper[4832]: E1003 00:11:54.226475 4832 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93fca13a-c8e1-4239-b419-e4055be395cd" containerName="registry-server" Oct 03 00:11:54 crc kubenswrapper[4832]: I1003 00:11:54.226482 4832 state_mem.go:107] "Deleted CPUSet assignment" podUID="93fca13a-c8e1-4239-b419-e4055be395cd" containerName="registry-server" Oct 03 00:11:54 crc kubenswrapper[4832]: E1003 00:11:54.226494 4832 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbb2ac77-f639-41e8-897d-7744dd4e987d" containerName="extract-utilities" Oct 03 00:11:54 crc kubenswrapper[4832]: I1003 00:11:54.226501 4832 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbb2ac77-f639-41e8-897d-7744dd4e987d" containerName="extract-utilities" Oct 03 00:11:54 crc kubenswrapper[4832]: E1003 00:11:54.226512 4832 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbb2ac77-f639-41e8-897d-7744dd4e987d" containerName="extract-content" Oct 03 00:11:54 crc kubenswrapper[4832]: I1003 00:11:54.226521 4832 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbb2ac77-f639-41e8-897d-7744dd4e987d" containerName="extract-content" Oct 03 00:11:54 crc kubenswrapper[4832]: E1003 00:11:54.226533 4832 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbb2ac77-f639-41e8-897d-7744dd4e987d" containerName="registry-server" Oct 03 00:11:54 crc kubenswrapper[4832]: I1003 00:11:54.226540 4832 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbb2ac77-f639-41e8-897d-7744dd4e987d" containerName="registry-server" Oct 03 00:11:54 crc kubenswrapper[4832]: E1003 00:11:54.226552 4832 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99b27c40-f8ce-41a0-befc-18366ba6770b" containerName="extract-content" Oct 03 00:11:54 crc kubenswrapper[4832]: I1003 00:11:54.226559 4832 state_mem.go:107] "Deleted CPUSet assignment" podUID="99b27c40-f8ce-41a0-befc-18366ba6770b" containerName="extract-content" Oct 03 00:11:54 crc kubenswrapper[4832]: E1003 00:11:54.226576 4832 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="99b27c40-f8ce-41a0-befc-18366ba6770b" containerName="extract-utilities" Oct 03 00:11:54 crc kubenswrapper[4832]: I1003 00:11:54.226584 4832 state_mem.go:107] "Deleted CPUSet assignment" podUID="99b27c40-f8ce-41a0-befc-18366ba6770b" containerName="extract-utilities" Oct 03 00:11:54 crc kubenswrapper[4832]: E1003 00:11:54.226593 4832 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="137f996e-6dba-4fce-b6f4-650e9c96f006" containerName="extract-utilities" Oct 03 00:11:54 crc kubenswrapper[4832]: I1003 00:11:54.226601 4832 state_mem.go:107] "Deleted CPUSet assignment" podUID="137f996e-6dba-4fce-b6f4-650e9c96f006" containerName="extract-utilities" Oct 03 00:11:54 crc kubenswrapper[4832]: E1003 00:11:54.226613 4832 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93fca13a-c8e1-4239-b419-e4055be395cd" containerName="extract-content" Oct 03 00:11:54 crc kubenswrapper[4832]: I1003 00:11:54.226626 4832 state_mem.go:107] "Deleted CPUSet assignment" podUID="93fca13a-c8e1-4239-b419-e4055be395cd" containerName="extract-content" Oct 03 00:11:54 crc kubenswrapper[4832]: I1003 00:11:54.226775 4832 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ac12ff5-1646-45c2-822c-7741b913d805" containerName="marketplace-operator" Oct 03 00:11:54 crc kubenswrapper[4832]: I1003 00:11:54.226790 4832 memory_manager.go:354] "RemoveStaleState removing state" podUID="bbb2ac77-f639-41e8-897d-7744dd4e987d" containerName="registry-server" Oct 03 00:11:54 crc kubenswrapper[4832]: I1003 00:11:54.226825 4832 memory_manager.go:354] "RemoveStaleState removing state" podUID="99b27c40-f8ce-41a0-befc-18366ba6770b" containerName="registry-server" Oct 03 00:11:54 crc kubenswrapper[4832]: I1003 00:11:54.226833 4832 memory_manager.go:354] "RemoveStaleState removing state" podUID="93fca13a-c8e1-4239-b419-e4055be395cd" containerName="registry-server" Oct 03 00:11:54 crc kubenswrapper[4832]: I1003 00:11:54.226842 4832 memory_manager.go:354] "RemoveStaleState removing state" podUID="137f996e-6dba-4fce-b6f4-650e9c96f006" containerName="registry-server" Oct 03 00:11:54 crc kubenswrapper[4832]: I1003 00:11:54.227726 4832 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gk9cm" Oct 03 00:11:54 crc kubenswrapper[4832]: I1003 00:11:54.230904 4832 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Oct 03 00:11:54 crc kubenswrapper[4832]: I1003 00:11:54.237492 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gk9cm"] Oct 03 00:11:54 crc kubenswrapper[4832]: I1003 00:11:54.360474 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8246614a-4293-4792-9a59-dafd437ccd03-utilities\") pod \"redhat-operators-gk9cm\" (UID: \"8246614a-4293-4792-9a59-dafd437ccd03\") " pod="openshift-marketplace/redhat-operators-gk9cm" Oct 03 00:11:54 crc kubenswrapper[4832]: I1003 00:11:54.360543 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8246614a-4293-4792-9a59-dafd437ccd03-catalog-content\") pod \"redhat-operators-gk9cm\" (UID: \"8246614a-4293-4792-9a59-dafd437ccd03\") " pod="openshift-marketplace/redhat-operators-gk9cm" Oct 03 00:11:54 crc kubenswrapper[4832]: I1003 00:11:54.360576 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sq4mx\" (UniqueName: \"kubernetes.io/projected/8246614a-4293-4792-9a59-dafd437ccd03-kube-api-access-sq4mx\") pod \"redhat-operators-gk9cm\" (UID: \"8246614a-4293-4792-9a59-dafd437ccd03\") " pod="openshift-marketplace/redhat-operators-gk9cm" Oct 03 00:11:54 crc kubenswrapper[4832]: I1003 00:11:54.462102 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8246614a-4293-4792-9a59-dafd437ccd03-utilities\") pod \"redhat-operators-gk9cm\" (UID: \"8246614a-4293-4792-9a59-dafd437ccd03\") " pod="openshift-marketplace/redhat-operators-gk9cm" Oct 03 00:11:54 crc kubenswrapper[4832]: I1003 00:11:54.462519 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8246614a-4293-4792-9a59-dafd437ccd03-utilities\") pod \"redhat-operators-gk9cm\" (UID: \"8246614a-4293-4792-9a59-dafd437ccd03\") " pod="openshift-marketplace/redhat-operators-gk9cm" Oct 03 00:11:54 crc kubenswrapper[4832]: I1003 00:11:54.462598 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8246614a-4293-4792-9a59-dafd437ccd03-catalog-content\") pod \"redhat-operators-gk9cm\" (UID: \"8246614a-4293-4792-9a59-dafd437ccd03\") " pod="openshift-marketplace/redhat-operators-gk9cm" Oct 03 00:11:54 crc kubenswrapper[4832]: I1003 00:11:54.462877 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8246614a-4293-4792-9a59-dafd437ccd03-catalog-content\") pod \"redhat-operators-gk9cm\" (UID: \"8246614a-4293-4792-9a59-dafd437ccd03\") " pod="openshift-marketplace/redhat-operators-gk9cm" Oct 03 00:11:54 crc kubenswrapper[4832]: I1003 00:11:54.462944 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sq4mx\" (UniqueName: \"kubernetes.io/projected/8246614a-4293-4792-9a59-dafd437ccd03-kube-api-access-sq4mx\") pod \"redhat-operators-gk9cm\" (UID: \"8246614a-4293-4792-9a59-dafd437ccd03\") " 
pod="openshift-marketplace/redhat-operators-gk9cm" Oct 03 00:11:54 crc kubenswrapper[4832]: I1003 00:11:54.479901 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sq4mx\" (UniqueName: \"kubernetes.io/projected/8246614a-4293-4792-9a59-dafd437ccd03-kube-api-access-sq4mx\") pod \"redhat-operators-gk9cm\" (UID: \"8246614a-4293-4792-9a59-dafd437ccd03\") " pod="openshift-marketplace/redhat-operators-gk9cm" Oct 03 00:11:54 crc kubenswrapper[4832]: I1003 00:11:54.548896 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gk9cm" Oct 03 00:11:54 crc kubenswrapper[4832]: I1003 00:11:54.748136 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gk9cm"] Oct 03 00:11:55 crc kubenswrapper[4832]: I1003 00:11:55.393694 4832 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="137f996e-6dba-4fce-b6f4-650e9c96f006" path="/var/lib/kubelet/pods/137f996e-6dba-4fce-b6f4-650e9c96f006/volumes" Oct 03 00:11:55 crc kubenswrapper[4832]: I1003 00:11:55.394929 4832 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ac12ff5-1646-45c2-822c-7741b913d805" path="/var/lib/kubelet/pods/8ac12ff5-1646-45c2-822c-7741b913d805/volumes" Oct 03 00:11:55 crc kubenswrapper[4832]: I1003 00:11:55.395497 4832 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99b27c40-f8ce-41a0-befc-18366ba6770b" path="/var/lib/kubelet/pods/99b27c40-f8ce-41a0-befc-18366ba6770b/volumes" Oct 03 00:11:55 crc kubenswrapper[4832]: I1003 00:11:55.731054 4832 generic.go:334] "Generic (PLEG): container finished" podID="8246614a-4293-4792-9a59-dafd437ccd03" containerID="2de4a0623866b55f483c1b99e913bed70d31d6268b18110eafa5b35f99f954f0" exitCode=0 Oct 03 00:11:55 crc kubenswrapper[4832]: I1003 00:11:55.731139 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gk9cm" event={"ID":"8246614a-4293-4792-9a59-dafd437ccd03","Type":"ContainerDied","Data":"2de4a0623866b55f483c1b99e913bed70d31d6268b18110eafa5b35f99f954f0"} Oct 03 00:11:55 crc kubenswrapper[4832]: I1003 00:11:55.731849 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gk9cm" event={"ID":"8246614a-4293-4792-9a59-dafd437ccd03","Type":"ContainerStarted","Data":"6c1493bc72ba6f751ac85573e0981828ade45b22a155f055deb2f4294d4d4f28"} Oct 03 00:11:56 crc kubenswrapper[4832]: I1003 00:11:56.021146 4832 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-gxnkl"] Oct 03 00:11:56 crc kubenswrapper[4832]: I1003 00:11:56.022089 4832 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-gxnkl" Oct 03 00:11:56 crc kubenswrapper[4832]: I1003 00:11:56.023688 4832 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Oct 03 00:11:56 crc kubenswrapper[4832]: I1003 00:11:56.030288 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gxnkl"] Oct 03 00:11:56 crc kubenswrapper[4832]: I1003 00:11:56.182967 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9594493-a290-4381-8724-3037db3c8c26-catalog-content\") pod \"certified-operators-gxnkl\" (UID: \"c9594493-a290-4381-8724-3037db3c8c26\") " pod="openshift-marketplace/certified-operators-gxnkl" Oct 03 00:11:56 crc kubenswrapper[4832]: I1003 00:11:56.183502 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74c5f\" (UniqueName: \"kubernetes.io/projected/c9594493-a290-4381-8724-3037db3c8c26-kube-api-access-74c5f\") pod \"certified-operators-gxnkl\" (UID: \"c9594493-a290-4381-8724-3037db3c8c26\") " pod="openshift-marketplace/certified-operators-gxnkl" Oct 03 00:11:56 crc kubenswrapper[4832]: I1003 00:11:56.183629 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9594493-a290-4381-8724-3037db3c8c26-utilities\") pod \"certified-operators-gxnkl\" (UID: \"c9594493-a290-4381-8724-3037db3c8c26\") " pod="openshift-marketplace/certified-operators-gxnkl" Oct 03 00:11:56 crc kubenswrapper[4832]: I1003 00:11:56.285131 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9594493-a290-4381-8724-3037db3c8c26-utilities\") pod \"certified-operators-gxnkl\" (UID: \"c9594493-a290-4381-8724-3037db3c8c26\") " pod="openshift-marketplace/certified-operators-gxnkl" Oct 03 00:11:56 crc kubenswrapper[4832]: I1003 00:11:56.285217 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9594493-a290-4381-8724-3037db3c8c26-catalog-content\") pod \"certified-operators-gxnkl\" (UID: \"c9594493-a290-4381-8724-3037db3c8c26\") " pod="openshift-marketplace/certified-operators-gxnkl" Oct 03 00:11:56 crc kubenswrapper[4832]: I1003 00:11:56.285543 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-74c5f\" (UniqueName: \"kubernetes.io/projected/c9594493-a290-4381-8724-3037db3c8c26-kube-api-access-74c5f\") pod \"certified-operators-gxnkl\" (UID: \"c9594493-a290-4381-8724-3037db3c8c26\") " pod="openshift-marketplace/certified-operators-gxnkl" Oct 03 00:11:56 crc kubenswrapper[4832]: I1003 00:11:56.285800 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9594493-a290-4381-8724-3037db3c8c26-utilities\") pod \"certified-operators-gxnkl\" (UID: \"c9594493-a290-4381-8724-3037db3c8c26\") " pod="openshift-marketplace/certified-operators-gxnkl" Oct 03 00:11:56 crc kubenswrapper[4832]: I1003 00:11:56.285813 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9594493-a290-4381-8724-3037db3c8c26-catalog-content\") pod \"certified-operators-gxnkl\" (UID: 
\"c9594493-a290-4381-8724-3037db3c8c26\") " pod="openshift-marketplace/certified-operators-gxnkl" Oct 03 00:11:56 crc kubenswrapper[4832]: I1003 00:11:56.306151 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74c5f\" (UniqueName: \"kubernetes.io/projected/c9594493-a290-4381-8724-3037db3c8c26-kube-api-access-74c5f\") pod \"certified-operators-gxnkl\" (UID: \"c9594493-a290-4381-8724-3037db3c8c26\") " pod="openshift-marketplace/certified-operators-gxnkl" Oct 03 00:11:56 crc kubenswrapper[4832]: I1003 00:11:56.344642 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gxnkl" Oct 03 00:11:56 crc kubenswrapper[4832]: I1003 00:11:56.515384 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gxnkl"] Oct 03 00:11:56 crc kubenswrapper[4832]: W1003 00:11:56.521967 4832 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc9594493_a290_4381_8724_3037db3c8c26.slice/crio-fe95ebd62216dc12687cb01daaeb2a37fe1ca1faddad23e2361fecdcb5f34216 WatchSource:0}: Error finding container fe95ebd62216dc12687cb01daaeb2a37fe1ca1faddad23e2361fecdcb5f34216: Status 404 returned error can't find the container with id fe95ebd62216dc12687cb01daaeb2a37fe1ca1faddad23e2361fecdcb5f34216 Oct 03 00:11:56 crc kubenswrapper[4832]: I1003 00:11:56.622745 4832 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-f7556"] Oct 03 00:11:56 crc kubenswrapper[4832]: I1003 00:11:56.624455 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-f7556" Oct 03 00:11:56 crc kubenswrapper[4832]: I1003 00:11:56.627585 4832 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Oct 03 00:11:56 crc kubenswrapper[4832]: I1003 00:11:56.630822 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-f7556"] Oct 03 00:11:56 crc kubenswrapper[4832]: I1003 00:11:56.737508 4832 generic.go:334] "Generic (PLEG): container finished" podID="c9594493-a290-4381-8724-3037db3c8c26" containerID="44ab958e4a3d6f98d917cfbf2fd20ab6f6d666d595fecdd784677857af38162a" exitCode=0 Oct 03 00:11:56 crc kubenswrapper[4832]: I1003 00:11:56.737546 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gxnkl" event={"ID":"c9594493-a290-4381-8724-3037db3c8c26","Type":"ContainerDied","Data":"44ab958e4a3d6f98d917cfbf2fd20ab6f6d666d595fecdd784677857af38162a"} Oct 03 00:11:56 crc kubenswrapper[4832]: I1003 00:11:56.737568 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gxnkl" event={"ID":"c9594493-a290-4381-8724-3037db3c8c26","Type":"ContainerStarted","Data":"fe95ebd62216dc12687cb01daaeb2a37fe1ca1faddad23e2361fecdcb5f34216"} Oct 03 00:11:56 crc kubenswrapper[4832]: I1003 00:11:56.792088 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7kw2\" (UniqueName: \"kubernetes.io/projected/18676bf1-5bc1-452b-89a1-bdd989c9e0b1-kube-api-access-g7kw2\") pod \"community-operators-f7556\" (UID: \"18676bf1-5bc1-452b-89a1-bdd989c9e0b1\") " pod="openshift-marketplace/community-operators-f7556" Oct 03 00:11:56 crc kubenswrapper[4832]: I1003 00:11:56.792210 4832 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18676bf1-5bc1-452b-89a1-bdd989c9e0b1-utilities\") pod \"community-operators-f7556\" (UID: \"18676bf1-5bc1-452b-89a1-bdd989c9e0b1\") " pod="openshift-marketplace/community-operators-f7556" Oct 03 00:11:56 crc kubenswrapper[4832]: I1003 00:11:56.792234 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18676bf1-5bc1-452b-89a1-bdd989c9e0b1-catalog-content\") pod \"community-operators-f7556\" (UID: \"18676bf1-5bc1-452b-89a1-bdd989c9e0b1\") " pod="openshift-marketplace/community-operators-f7556" Oct 03 00:11:56 crc kubenswrapper[4832]: I1003 00:11:56.893672 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18676bf1-5bc1-452b-89a1-bdd989c9e0b1-utilities\") pod \"community-operators-f7556\" (UID: \"18676bf1-5bc1-452b-89a1-bdd989c9e0b1\") " pod="openshift-marketplace/community-operators-f7556" Oct 03 00:11:56 crc kubenswrapper[4832]: I1003 00:11:56.893713 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18676bf1-5bc1-452b-89a1-bdd989c9e0b1-catalog-content\") pod \"community-operators-f7556\" (UID: \"18676bf1-5bc1-452b-89a1-bdd989c9e0b1\") " pod="openshift-marketplace/community-operators-f7556" Oct 03 00:11:56 crc kubenswrapper[4832]: I1003 00:11:56.893795 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g7kw2\" (UniqueName: \"kubernetes.io/projected/18676bf1-5bc1-452b-89a1-bdd989c9e0b1-kube-api-access-g7kw2\") pod \"community-operators-f7556\" (UID: \"18676bf1-5bc1-452b-89a1-bdd989c9e0b1\") " pod="openshift-marketplace/community-operators-f7556" Oct 03 00:11:56 crc kubenswrapper[4832]: I1003 00:11:56.894255 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18676bf1-5bc1-452b-89a1-bdd989c9e0b1-utilities\") pod \"community-operators-f7556\" (UID: \"18676bf1-5bc1-452b-89a1-bdd989c9e0b1\") " pod="openshift-marketplace/community-operators-f7556" Oct 03 00:11:56 crc kubenswrapper[4832]: I1003 00:11:56.894281 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18676bf1-5bc1-452b-89a1-bdd989c9e0b1-catalog-content\") pod \"community-operators-f7556\" (UID: \"18676bf1-5bc1-452b-89a1-bdd989c9e0b1\") " pod="openshift-marketplace/community-operators-f7556" Oct 03 00:11:56 crc kubenswrapper[4832]: I1003 00:11:56.912107 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7kw2\" (UniqueName: \"kubernetes.io/projected/18676bf1-5bc1-452b-89a1-bdd989c9e0b1-kube-api-access-g7kw2\") pod \"community-operators-f7556\" (UID: \"18676bf1-5bc1-452b-89a1-bdd989c9e0b1\") " pod="openshift-marketplace/community-operators-f7556" Oct 03 00:11:56 crc kubenswrapper[4832]: I1003 00:11:56.945744 4832 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-f7556" Oct 03 00:11:57 crc kubenswrapper[4832]: I1003 00:11:57.361501 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-f7556"] Oct 03 00:11:57 crc kubenswrapper[4832]: I1003 00:11:57.744481 4832 generic.go:334] "Generic (PLEG): container finished" podID="8246614a-4293-4792-9a59-dafd437ccd03" containerID="4e23ea78451c52f951bfb2c6c7ea4c6e01cfd1d607a6e648676ce6a700c1e4d8" exitCode=0 Oct 03 00:11:57 crc kubenswrapper[4832]: I1003 00:11:57.744577 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gk9cm" event={"ID":"8246614a-4293-4792-9a59-dafd437ccd03","Type":"ContainerDied","Data":"4e23ea78451c52f951bfb2c6c7ea4c6e01cfd1d607a6e648676ce6a700c1e4d8"} Oct 03 00:11:57 crc kubenswrapper[4832]: I1003 00:11:57.745989 4832 generic.go:334] "Generic (PLEG): container finished" podID="18676bf1-5bc1-452b-89a1-bdd989c9e0b1" containerID="2e0cb541615f65ea539f88c1dfbe1f7b5e4e8fa592b250e6b9dc46307c9f3ad2" exitCode=0 Oct 03 00:11:57 crc kubenswrapper[4832]: I1003 00:11:57.746036 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f7556" event={"ID":"18676bf1-5bc1-452b-89a1-bdd989c9e0b1","Type":"ContainerDied","Data":"2e0cb541615f65ea539f88c1dfbe1f7b5e4e8fa592b250e6b9dc46307c9f3ad2"} Oct 03 00:11:57 crc kubenswrapper[4832]: I1003 00:11:57.746069 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f7556" event={"ID":"18676bf1-5bc1-452b-89a1-bdd989c9e0b1","Type":"ContainerStarted","Data":"f4fa208eee473803b2cf95fc0dda1a857b7d71ca176e8633a687ba18e004d6b8"} Oct 03 00:11:58 crc kubenswrapper[4832]: I1003 00:11:58.423927 4832 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-f9csr"] Oct 03 00:11:58 crc kubenswrapper[4832]: I1003 00:11:58.425268 4832 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f9csr" Oct 03 00:11:58 crc kubenswrapper[4832]: I1003 00:11:58.426959 4832 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Oct 03 00:11:58 crc kubenswrapper[4832]: I1003 00:11:58.432293 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-f9csr"] Oct 03 00:11:58 crc kubenswrapper[4832]: I1003 00:11:58.513143 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xn2mk\" (UniqueName: \"kubernetes.io/projected/c40eb97c-936a-4d46-9016-3a062386ae3c-kube-api-access-xn2mk\") pod \"redhat-marketplace-f9csr\" (UID: \"c40eb97c-936a-4d46-9016-3a062386ae3c\") " pod="openshift-marketplace/redhat-marketplace-f9csr" Oct 03 00:11:58 crc kubenswrapper[4832]: I1003 00:11:58.513207 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c40eb97c-936a-4d46-9016-3a062386ae3c-catalog-content\") pod \"redhat-marketplace-f9csr\" (UID: \"c40eb97c-936a-4d46-9016-3a062386ae3c\") " pod="openshift-marketplace/redhat-marketplace-f9csr" Oct 03 00:11:58 crc kubenswrapper[4832]: I1003 00:11:58.513270 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c40eb97c-936a-4d46-9016-3a062386ae3c-utilities\") pod \"redhat-marketplace-f9csr\" (UID: \"c40eb97c-936a-4d46-9016-3a062386ae3c\") " pod="openshift-marketplace/redhat-marketplace-f9csr" Oct 03 00:11:58 crc kubenswrapper[4832]: I1003 00:11:58.614206 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xn2mk\" (UniqueName: \"kubernetes.io/projected/c40eb97c-936a-4d46-9016-3a062386ae3c-kube-api-access-xn2mk\") pod \"redhat-marketplace-f9csr\" (UID: \"c40eb97c-936a-4d46-9016-3a062386ae3c\") " pod="openshift-marketplace/redhat-marketplace-f9csr" Oct 03 00:11:58 crc kubenswrapper[4832]: I1003 00:11:58.614278 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c40eb97c-936a-4d46-9016-3a062386ae3c-catalog-content\") pod \"redhat-marketplace-f9csr\" (UID: \"c40eb97c-936a-4d46-9016-3a062386ae3c\") " pod="openshift-marketplace/redhat-marketplace-f9csr" Oct 03 00:11:58 crc kubenswrapper[4832]: I1003 00:11:58.614320 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c40eb97c-936a-4d46-9016-3a062386ae3c-utilities\") pod \"redhat-marketplace-f9csr\" (UID: \"c40eb97c-936a-4d46-9016-3a062386ae3c\") " pod="openshift-marketplace/redhat-marketplace-f9csr" Oct 03 00:11:58 crc kubenswrapper[4832]: I1003 00:11:58.614868 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c40eb97c-936a-4d46-9016-3a062386ae3c-utilities\") pod \"redhat-marketplace-f9csr\" (UID: \"c40eb97c-936a-4d46-9016-3a062386ae3c\") " pod="openshift-marketplace/redhat-marketplace-f9csr" Oct 03 00:11:58 crc kubenswrapper[4832]: I1003 00:11:58.614986 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c40eb97c-936a-4d46-9016-3a062386ae3c-catalog-content\") pod \"redhat-marketplace-f9csr\" (UID: 
\"c40eb97c-936a-4d46-9016-3a062386ae3c\") " pod="openshift-marketplace/redhat-marketplace-f9csr" Oct 03 00:11:58 crc kubenswrapper[4832]: I1003 00:11:58.631852 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xn2mk\" (UniqueName: \"kubernetes.io/projected/c40eb97c-936a-4d46-9016-3a062386ae3c-kube-api-access-xn2mk\") pod \"redhat-marketplace-f9csr\" (UID: \"c40eb97c-936a-4d46-9016-3a062386ae3c\") " pod="openshift-marketplace/redhat-marketplace-f9csr" Oct 03 00:11:58 crc kubenswrapper[4832]: I1003 00:11:58.746795 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f9csr" Oct 03 00:11:58 crc kubenswrapper[4832]: I1003 00:11:58.752176 4832 generic.go:334] "Generic (PLEG): container finished" podID="c9594493-a290-4381-8724-3037db3c8c26" containerID="9b5c3c4d11f0a925db12609d35c9a99e2fa1bab1aa89dfdcdb9542dcadfb4fec" exitCode=0 Oct 03 00:11:58 crc kubenswrapper[4832]: I1003 00:11:58.752222 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gxnkl" event={"ID":"c9594493-a290-4381-8724-3037db3c8c26","Type":"ContainerDied","Data":"9b5c3c4d11f0a925db12609d35c9a99e2fa1bab1aa89dfdcdb9542dcadfb4fec"} Oct 03 00:11:59 crc kubenswrapper[4832]: I1003 00:11:59.351383 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-f9csr"] Oct 03 00:11:59 crc kubenswrapper[4832]: I1003 00:11:59.757402 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f9csr" event={"ID":"c40eb97c-936a-4d46-9016-3a062386ae3c","Type":"ContainerStarted","Data":"4c8a3d61f2ab3118f38a914b09efdb23cc3d01ef05ed0f64ed799d6ac19f41cc"} Oct 03 00:11:59 crc kubenswrapper[4832]: I1003 00:11:59.759442 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gk9cm" event={"ID":"8246614a-4293-4792-9a59-dafd437ccd03","Type":"ContainerStarted","Data":"256b4368cddc2b0de0ebe4aaa95b825a3cb7a624751595a8609f8b0c6df07da1"} Oct 03 00:11:59 crc kubenswrapper[4832]: I1003 00:11:59.762532 4832 generic.go:334] "Generic (PLEG): container finished" podID="18676bf1-5bc1-452b-89a1-bdd989c9e0b1" containerID="bfc0489fa52d551b769dd75de7f1244794546c28bc0cb04e1bb2f914217cc0a8" exitCode=0 Oct 03 00:11:59 crc kubenswrapper[4832]: I1003 00:11:59.762555 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f7556" event={"ID":"18676bf1-5bc1-452b-89a1-bdd989c9e0b1","Type":"ContainerDied","Data":"bfc0489fa52d551b769dd75de7f1244794546c28bc0cb04e1bb2f914217cc0a8"} Oct 03 00:11:59 crc kubenswrapper[4832]: I1003 00:11:59.774984 4832 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-gk9cm" podStartSLOduration=2.543178873 podStartE2EDuration="5.774969934s" podCreationTimestamp="2025-10-03 00:11:54 +0000 UTC" firstStartedPulling="2025-10-03 00:11:55.733278766 +0000 UTC m=+308.995433872" lastFinishedPulling="2025-10-03 00:11:58.965069827 +0000 UTC m=+312.227224933" observedRunningTime="2025-10-03 00:11:59.773726656 +0000 UTC m=+313.035881782" watchObservedRunningTime="2025-10-03 00:11:59.774969934 +0000 UTC m=+313.037125040" Oct 03 00:12:01 crc kubenswrapper[4832]: I1003 00:12:01.794916 4832 generic.go:334] "Generic (PLEG): container finished" podID="c40eb97c-936a-4d46-9016-3a062386ae3c" containerID="36eab17ae289307ea9104a10805e37cf6b5e6ab3696f482fb176dd7d09d406a1" exitCode=0 Oct 03 
00:12:01 crc kubenswrapper[4832]: I1003 00:12:01.795074 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f9csr" event={"ID":"c40eb97c-936a-4d46-9016-3a062386ae3c","Type":"ContainerDied","Data":"36eab17ae289307ea9104a10805e37cf6b5e6ab3696f482fb176dd7d09d406a1"} Oct 03 00:12:01 crc kubenswrapper[4832]: I1003 00:12:01.798229 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gxnkl" event={"ID":"c9594493-a290-4381-8724-3037db3c8c26","Type":"ContainerStarted","Data":"5844c9160e8a394e53631072046e8b2e2077628ce2e6d88a89c4f0c710937e3a"} Oct 03 00:12:01 crc kubenswrapper[4832]: I1003 00:12:01.802446 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f7556" event={"ID":"18676bf1-5bc1-452b-89a1-bdd989c9e0b1","Type":"ContainerStarted","Data":"2c9519d841f56292c58e33fe406cecd0ba1481faa5c92289f00a690d2b9f86a0"} Oct 03 00:12:01 crc kubenswrapper[4832]: I1003 00:12:01.832504 4832 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-f7556" podStartSLOduration=3.29231452 podStartE2EDuration="5.832481253s" podCreationTimestamp="2025-10-03 00:11:56 +0000 UTC" firstStartedPulling="2025-10-03 00:11:57.748831035 +0000 UTC m=+311.010986141" lastFinishedPulling="2025-10-03 00:12:00.288997768 +0000 UTC m=+313.551152874" observedRunningTime="2025-10-03 00:12:01.827629395 +0000 UTC m=+315.089784521" watchObservedRunningTime="2025-10-03 00:12:01.832481253 +0000 UTC m=+315.094636359" Oct 03 00:12:01 crc kubenswrapper[4832]: I1003 00:12:01.846803 4832 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-gxnkl" podStartSLOduration=3.135027249 podStartE2EDuration="5.846782935s" podCreationTimestamp="2025-10-03 00:11:56 +0000 UTC" firstStartedPulling="2025-10-03 00:11:56.738506463 +0000 UTC m=+310.000661569" lastFinishedPulling="2025-10-03 00:11:59.450262139 +0000 UTC m=+312.712417255" observedRunningTime="2025-10-03 00:12:01.844734793 +0000 UTC m=+315.106889899" watchObservedRunningTime="2025-10-03 00:12:01.846782935 +0000 UTC m=+315.108938051" Oct 03 00:12:02 crc kubenswrapper[4832]: I1003 00:12:02.819494 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f9csr" event={"ID":"c40eb97c-936a-4d46-9016-3a062386ae3c","Type":"ContainerStarted","Data":"09b7ee49ed3a96cdf565a16828e513898cbc11bbd226ed20c84c2ea84c5f135c"} Oct 03 00:12:03 crc kubenswrapper[4832]: I1003 00:12:03.842844 4832 generic.go:334] "Generic (PLEG): container finished" podID="c40eb97c-936a-4d46-9016-3a062386ae3c" containerID="09b7ee49ed3a96cdf565a16828e513898cbc11bbd226ed20c84c2ea84c5f135c" exitCode=0 Oct 03 00:12:03 crc kubenswrapper[4832]: I1003 00:12:03.842912 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f9csr" event={"ID":"c40eb97c-936a-4d46-9016-3a062386ae3c","Type":"ContainerDied","Data":"09b7ee49ed3a96cdf565a16828e513898cbc11bbd226ed20c84c2ea84c5f135c"} Oct 03 00:12:04 crc kubenswrapper[4832]: I1003 00:12:04.549734 4832 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-gk9cm" Oct 03 00:12:04 crc kubenswrapper[4832]: I1003 00:12:04.549808 4832 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-gk9cm" Oct 03 00:12:04 crc kubenswrapper[4832]: I1003 
00:12:04.592393 4832 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-gk9cm" Oct 03 00:12:04 crc kubenswrapper[4832]: I1003 00:12:04.885456 4832 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-gk9cm" Oct 03 00:12:05 crc kubenswrapper[4832]: I1003 00:12:05.856440 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f9csr" event={"ID":"c40eb97c-936a-4d46-9016-3a062386ae3c","Type":"ContainerStarted","Data":"177c9d05c3dec624ffb0ffc23ee090c6ada16e25553b570d5d51818e8edc9d0b"} Oct 03 00:12:05 crc kubenswrapper[4832]: I1003 00:12:05.876413 4832 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-f9csr" podStartSLOduration=4.438563621 podStartE2EDuration="7.876397477s" podCreationTimestamp="2025-10-03 00:11:58 +0000 UTC" firstStartedPulling="2025-10-03 00:12:01.796805132 +0000 UTC m=+315.058960248" lastFinishedPulling="2025-10-03 00:12:05.234638998 +0000 UTC m=+318.496794104" observedRunningTime="2025-10-03 00:12:05.874905443 +0000 UTC m=+319.137060559" watchObservedRunningTime="2025-10-03 00:12:05.876397477 +0000 UTC m=+319.138552573" Oct 03 00:12:06 crc kubenswrapper[4832]: I1003 00:12:06.345145 4832 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-gxnkl" Oct 03 00:12:06 crc kubenswrapper[4832]: I1003 00:12:06.345203 4832 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-gxnkl" Oct 03 00:12:06 crc kubenswrapper[4832]: I1003 00:12:06.379788 4832 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-gxnkl" Oct 03 00:12:06 crc kubenswrapper[4832]: I1003 00:12:06.894860 4832 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-gxnkl" Oct 03 00:12:06 crc kubenswrapper[4832]: I1003 00:12:06.945935 4832 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-f7556" Oct 03 00:12:06 crc kubenswrapper[4832]: I1003 00:12:06.946008 4832 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-f7556" Oct 03 00:12:06 crc kubenswrapper[4832]: I1003 00:12:06.979171 4832 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-f7556" Oct 03 00:12:07 crc kubenswrapper[4832]: I1003 00:12:07.908941 4832 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-f7556" Oct 03 00:12:08 crc kubenswrapper[4832]: I1003 00:12:08.747671 4832 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-f9csr" Oct 03 00:12:08 crc kubenswrapper[4832]: I1003 00:12:08.747726 4832 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-f9csr" Oct 03 00:12:08 crc kubenswrapper[4832]: I1003 00:12:08.785266 4832 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-f9csr" Oct 03 00:12:18 crc kubenswrapper[4832]: I1003 00:12:18.293219 4832 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-authentication/oauth-openshift-558db77b4-sl4zz" podUID="e422e2ec-57ff-4009-8065-bfa489b34334" containerName="oauth-openshift" containerID="cri-o://a4a5ffc139efce7135582f03ab73f42d14b384b7e5ef935c34b1b475bdd5c4b4" gracePeriod=15 Oct 03 00:12:18 crc kubenswrapper[4832]: I1003 00:12:18.647373 4832 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-sl4zz" Oct 03 00:12:18 crc kubenswrapper[4832]: I1003 00:12:18.683708 4832 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-f54c45747-55xdv"] Oct 03 00:12:18 crc kubenswrapper[4832]: E1003 00:12:18.687907 4832 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e422e2ec-57ff-4009-8065-bfa489b34334" containerName="oauth-openshift" Oct 03 00:12:18 crc kubenswrapper[4832]: I1003 00:12:18.687973 4832 state_mem.go:107] "Deleted CPUSet assignment" podUID="e422e2ec-57ff-4009-8065-bfa489b34334" containerName="oauth-openshift" Oct 03 00:12:18 crc kubenswrapper[4832]: I1003 00:12:18.688393 4832 memory_manager.go:354] "RemoveStaleState removing state" podUID="e422e2ec-57ff-4009-8065-bfa489b34334" containerName="oauth-openshift" Oct 03 00:12:18 crc kubenswrapper[4832]: I1003 00:12:18.689143 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-f54c45747-55xdv" Oct 03 00:12:18 crc kubenswrapper[4832]: I1003 00:12:18.691893 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-f54c45747-55xdv"] Oct 03 00:12:18 crc kubenswrapper[4832]: I1003 00:12:18.756481 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e422e2ec-57ff-4009-8065-bfa489b34334-audit-policies\") pod \"e422e2ec-57ff-4009-8065-bfa489b34334\" (UID: \"e422e2ec-57ff-4009-8065-bfa489b34334\") " Oct 03 00:12:18 crc kubenswrapper[4832]: I1003 00:12:18.757037 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qndd5\" (UniqueName: \"kubernetes.io/projected/e422e2ec-57ff-4009-8065-bfa489b34334-kube-api-access-qndd5\") pod \"e422e2ec-57ff-4009-8065-bfa489b34334\" (UID: \"e422e2ec-57ff-4009-8065-bfa489b34334\") " Oct 03 00:12:18 crc kubenswrapper[4832]: I1003 00:12:18.757151 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/e422e2ec-57ff-4009-8065-bfa489b34334-v4-0-config-user-template-login\") pod \"e422e2ec-57ff-4009-8065-bfa489b34334\" (UID: \"e422e2ec-57ff-4009-8065-bfa489b34334\") " Oct 03 00:12:18 crc kubenswrapper[4832]: I1003 00:12:18.757238 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e422e2ec-57ff-4009-8065-bfa489b34334-audit-dir\") pod \"e422e2ec-57ff-4009-8065-bfa489b34334\" (UID: \"e422e2ec-57ff-4009-8065-bfa489b34334\") " Oct 03 00:12:18 crc kubenswrapper[4832]: I1003 00:12:18.757324 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/e422e2ec-57ff-4009-8065-bfa489b34334-v4-0-config-system-session\") pod \"e422e2ec-57ff-4009-8065-bfa489b34334\" (UID: \"e422e2ec-57ff-4009-8065-bfa489b34334\") " Oct 03 00:12:18 crc kubenswrapper[4832]: I1003 00:12:18.757422 4832 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/e422e2ec-57ff-4009-8065-bfa489b34334-v4-0-config-system-cliconfig\") pod \"e422e2ec-57ff-4009-8065-bfa489b34334\" (UID: \"e422e2ec-57ff-4009-8065-bfa489b34334\") " Oct 03 00:12:18 crc kubenswrapper[4832]: I1003 00:12:18.757506 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/e422e2ec-57ff-4009-8065-bfa489b34334-v4-0-config-user-idp-0-file-data\") pod \"e422e2ec-57ff-4009-8065-bfa489b34334\" (UID: \"e422e2ec-57ff-4009-8065-bfa489b34334\") " Oct 03 00:12:18 crc kubenswrapper[4832]: I1003 00:12:18.757591 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e422e2ec-57ff-4009-8065-bfa489b34334-v4-0-config-system-trusted-ca-bundle\") pod \"e422e2ec-57ff-4009-8065-bfa489b34334\" (UID: \"e422e2ec-57ff-4009-8065-bfa489b34334\") " Oct 03 00:12:18 crc kubenswrapper[4832]: I1003 00:12:18.757660 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/e422e2ec-57ff-4009-8065-bfa489b34334-v4-0-config-system-service-ca\") pod \"e422e2ec-57ff-4009-8065-bfa489b34334\" (UID: \"e422e2ec-57ff-4009-8065-bfa489b34334\") " Oct 03 00:12:18 crc kubenswrapper[4832]: I1003 00:12:18.757739 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/e422e2ec-57ff-4009-8065-bfa489b34334-v4-0-config-system-serving-cert\") pod \"e422e2ec-57ff-4009-8065-bfa489b34334\" (UID: \"e422e2ec-57ff-4009-8065-bfa489b34334\") " Oct 03 00:12:18 crc kubenswrapper[4832]: I1003 00:12:18.757833 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/e422e2ec-57ff-4009-8065-bfa489b34334-v4-0-config-system-router-certs\") pod \"e422e2ec-57ff-4009-8065-bfa489b34334\" (UID: \"e422e2ec-57ff-4009-8065-bfa489b34334\") " Oct 03 00:12:18 crc kubenswrapper[4832]: I1003 00:12:18.757929 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/e422e2ec-57ff-4009-8065-bfa489b34334-v4-0-config-system-ocp-branding-template\") pod \"e422e2ec-57ff-4009-8065-bfa489b34334\" (UID: \"e422e2ec-57ff-4009-8065-bfa489b34334\") " Oct 03 00:12:18 crc kubenswrapper[4832]: I1003 00:12:18.758019 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/e422e2ec-57ff-4009-8065-bfa489b34334-v4-0-config-user-template-provider-selection\") pod \"e422e2ec-57ff-4009-8065-bfa489b34334\" (UID: \"e422e2ec-57ff-4009-8065-bfa489b34334\") " Oct 03 00:12:18 crc kubenswrapper[4832]: I1003 00:12:18.758098 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/e422e2ec-57ff-4009-8065-bfa489b34334-v4-0-config-user-template-error\") pod \"e422e2ec-57ff-4009-8065-bfa489b34334\" (UID: \"e422e2ec-57ff-4009-8065-bfa489b34334\") " Oct 03 00:12:18 crc kubenswrapper[4832]: I1003 00:12:18.758388 4832 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/9542b67b-1582-4547-a94e-53952a29248e-v4-0-config-user-template-error\") pod \"oauth-openshift-f54c45747-55xdv\" (UID: \"9542b67b-1582-4547-a94e-53952a29248e\") " pod="openshift-authentication/oauth-openshift-f54c45747-55xdv" Oct 03 00:12:18 crc kubenswrapper[4832]: I1003 00:12:18.758501 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2bng\" (UniqueName: \"kubernetes.io/projected/9542b67b-1582-4547-a94e-53952a29248e-kube-api-access-z2bng\") pod \"oauth-openshift-f54c45747-55xdv\" (UID: \"9542b67b-1582-4547-a94e-53952a29248e\") " pod="openshift-authentication/oauth-openshift-f54c45747-55xdv" Oct 03 00:12:18 crc kubenswrapper[4832]: I1003 00:12:18.758582 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/9542b67b-1582-4547-a94e-53952a29248e-v4-0-config-user-template-login\") pod \"oauth-openshift-f54c45747-55xdv\" (UID: \"9542b67b-1582-4547-a94e-53952a29248e\") " pod="openshift-authentication/oauth-openshift-f54c45747-55xdv" Oct 03 00:12:18 crc kubenswrapper[4832]: I1003 00:12:18.758658 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/9542b67b-1582-4547-a94e-53952a29248e-v4-0-config-system-router-certs\") pod \"oauth-openshift-f54c45747-55xdv\" (UID: \"9542b67b-1582-4547-a94e-53952a29248e\") " pod="openshift-authentication/oauth-openshift-f54c45747-55xdv" Oct 03 00:12:18 crc kubenswrapper[4832]: I1003 00:12:18.758741 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/9542b67b-1582-4547-a94e-53952a29248e-v4-0-config-system-session\") pod \"oauth-openshift-f54c45747-55xdv\" (UID: \"9542b67b-1582-4547-a94e-53952a29248e\") " pod="openshift-authentication/oauth-openshift-f54c45747-55xdv" Oct 03 00:12:18 crc kubenswrapper[4832]: I1003 00:12:18.757332 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e422e2ec-57ff-4009-8065-bfa489b34334-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "e422e2ec-57ff-4009-8065-bfa489b34334" (UID: "e422e2ec-57ff-4009-8065-bfa489b34334"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 00:12:18 crc kubenswrapper[4832]: I1003 00:12:18.758306 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e422e2ec-57ff-4009-8065-bfa489b34334-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "e422e2ec-57ff-4009-8065-bfa489b34334" (UID: "e422e2ec-57ff-4009-8065-bfa489b34334"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 00:12:18 crc kubenswrapper[4832]: I1003 00:12:18.758349 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e422e2ec-57ff-4009-8065-bfa489b34334-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "e422e2ec-57ff-4009-8065-bfa489b34334" (UID: "e422e2ec-57ff-4009-8065-bfa489b34334"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 00:12:18 crc kubenswrapper[4832]: I1003 00:12:18.758785 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e422e2ec-57ff-4009-8065-bfa489b34334-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "e422e2ec-57ff-4009-8065-bfa489b34334" (UID: "e422e2ec-57ff-4009-8065-bfa489b34334"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 00:12:18 crc kubenswrapper[4832]: I1003 00:12:18.759081 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9542b67b-1582-4547-a94e-53952a29248e-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-f54c45747-55xdv\" (UID: \"9542b67b-1582-4547-a94e-53952a29248e\") " pod="openshift-authentication/oauth-openshift-f54c45747-55xdv" Oct 03 00:12:18 crc kubenswrapper[4832]: I1003 00:12:18.759185 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/9542b67b-1582-4547-a94e-53952a29248e-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-f54c45747-55xdv\" (UID: \"9542b67b-1582-4547-a94e-53952a29248e\") " pod="openshift-authentication/oauth-openshift-f54c45747-55xdv" Oct 03 00:12:18 crc kubenswrapper[4832]: I1003 00:12:18.759316 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/9542b67b-1582-4547-a94e-53952a29248e-v4-0-config-system-service-ca\") pod \"oauth-openshift-f54c45747-55xdv\" (UID: \"9542b67b-1582-4547-a94e-53952a29248e\") " pod="openshift-authentication/oauth-openshift-f54c45747-55xdv" Oct 03 00:12:18 crc kubenswrapper[4832]: I1003 00:12:18.759399 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/9542b67b-1582-4547-a94e-53952a29248e-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-f54c45747-55xdv\" (UID: \"9542b67b-1582-4547-a94e-53952a29248e\") " pod="openshift-authentication/oauth-openshift-f54c45747-55xdv" Oct 03 00:12:18 crc kubenswrapper[4832]: I1003 00:12:18.759475 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/9542b67b-1582-4547-a94e-53952a29248e-v4-0-config-system-serving-cert\") pod \"oauth-openshift-f54c45747-55xdv\" (UID: \"9542b67b-1582-4547-a94e-53952a29248e\") " pod="openshift-authentication/oauth-openshift-f54c45747-55xdv" Oct 03 00:12:18 crc kubenswrapper[4832]: I1003 00:12:18.759654 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/9542b67b-1582-4547-a94e-53952a29248e-v4-0-config-system-cliconfig\") pod \"oauth-openshift-f54c45747-55xdv\" (UID: \"9542b67b-1582-4547-a94e-53952a29248e\") " pod="openshift-authentication/oauth-openshift-f54c45747-55xdv" Oct 03 00:12:18 crc kubenswrapper[4832]: I1003 00:12:18.759763 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/9542b67b-1582-4547-a94e-53952a29248e-audit-dir\") pod \"oauth-openshift-f54c45747-55xdv\" (UID: \"9542b67b-1582-4547-a94e-53952a29248e\") " pod="openshift-authentication/oauth-openshift-f54c45747-55xdv" Oct 03 00:12:18 crc kubenswrapper[4832]: I1003 00:12:18.759858 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/9542b67b-1582-4547-a94e-53952a29248e-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-f54c45747-55xdv\" (UID: \"9542b67b-1582-4547-a94e-53952a29248e\") " pod="openshift-authentication/oauth-openshift-f54c45747-55xdv" Oct 03 00:12:18 crc kubenswrapper[4832]: I1003 00:12:18.759952 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/9542b67b-1582-4547-a94e-53952a29248e-audit-policies\") pod \"oauth-openshift-f54c45747-55xdv\" (UID: \"9542b67b-1582-4547-a94e-53952a29248e\") " pod="openshift-authentication/oauth-openshift-f54c45747-55xdv" Oct 03 00:12:18 crc kubenswrapper[4832]: I1003 00:12:18.760097 4832 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e422e2ec-57ff-4009-8065-bfa489b34334-audit-dir\") on node \"crc\" DevicePath \"\"" Oct 03 00:12:18 crc kubenswrapper[4832]: I1003 00:12:18.760170 4832 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/e422e2ec-57ff-4009-8065-bfa489b34334-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Oct 03 00:12:18 crc kubenswrapper[4832]: I1003 00:12:18.760257 4832 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e422e2ec-57ff-4009-8065-bfa489b34334-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 00:12:18 crc kubenswrapper[4832]: I1003 00:12:18.760339 4832 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/e422e2ec-57ff-4009-8065-bfa489b34334-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Oct 03 00:12:18 crc kubenswrapper[4832]: I1003 00:12:18.762032 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e422e2ec-57ff-4009-8065-bfa489b34334-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "e422e2ec-57ff-4009-8065-bfa489b34334" (UID: "e422e2ec-57ff-4009-8065-bfa489b34334"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 00:12:18 crc kubenswrapper[4832]: I1003 00:12:18.762426 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e422e2ec-57ff-4009-8065-bfa489b34334-kube-api-access-qndd5" (OuterVolumeSpecName: "kube-api-access-qndd5") pod "e422e2ec-57ff-4009-8065-bfa489b34334" (UID: "e422e2ec-57ff-4009-8065-bfa489b34334"). InnerVolumeSpecName "kube-api-access-qndd5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 00:12:18 crc kubenswrapper[4832]: I1003 00:12:18.764844 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e422e2ec-57ff-4009-8065-bfa489b34334-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "e422e2ec-57ff-4009-8065-bfa489b34334" (UID: "e422e2ec-57ff-4009-8065-bfa489b34334"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 00:12:18 crc kubenswrapper[4832]: I1003 00:12:18.765621 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e422e2ec-57ff-4009-8065-bfa489b34334-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "e422e2ec-57ff-4009-8065-bfa489b34334" (UID: "e422e2ec-57ff-4009-8065-bfa489b34334"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 00:12:18 crc kubenswrapper[4832]: I1003 00:12:18.766007 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e422e2ec-57ff-4009-8065-bfa489b34334-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "e422e2ec-57ff-4009-8065-bfa489b34334" (UID: "e422e2ec-57ff-4009-8065-bfa489b34334"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 00:12:18 crc kubenswrapper[4832]: I1003 00:12:18.766423 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e422e2ec-57ff-4009-8065-bfa489b34334-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "e422e2ec-57ff-4009-8065-bfa489b34334" (UID: "e422e2ec-57ff-4009-8065-bfa489b34334"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 00:12:18 crc kubenswrapper[4832]: I1003 00:12:18.766644 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e422e2ec-57ff-4009-8065-bfa489b34334-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "e422e2ec-57ff-4009-8065-bfa489b34334" (UID: "e422e2ec-57ff-4009-8065-bfa489b34334"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 00:12:18 crc kubenswrapper[4832]: I1003 00:12:18.767236 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e422e2ec-57ff-4009-8065-bfa489b34334-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "e422e2ec-57ff-4009-8065-bfa489b34334" (UID: "e422e2ec-57ff-4009-8065-bfa489b34334"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 00:12:18 crc kubenswrapper[4832]: I1003 00:12:18.767387 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e422e2ec-57ff-4009-8065-bfa489b34334-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "e422e2ec-57ff-4009-8065-bfa489b34334" (UID: "e422e2ec-57ff-4009-8065-bfa489b34334"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 00:12:18 crc kubenswrapper[4832]: I1003 00:12:18.774524 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e422e2ec-57ff-4009-8065-bfa489b34334-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "e422e2ec-57ff-4009-8065-bfa489b34334" (UID: "e422e2ec-57ff-4009-8065-bfa489b34334"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 00:12:18 crc kubenswrapper[4832]: I1003 00:12:18.790729 4832 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-f9csr" Oct 03 00:12:18 crc kubenswrapper[4832]: I1003 00:12:18.861403 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z2bng\" (UniqueName: \"kubernetes.io/projected/9542b67b-1582-4547-a94e-53952a29248e-kube-api-access-z2bng\") pod \"oauth-openshift-f54c45747-55xdv\" (UID: \"9542b67b-1582-4547-a94e-53952a29248e\") " pod="openshift-authentication/oauth-openshift-f54c45747-55xdv" Oct 03 00:12:18 crc kubenswrapper[4832]: I1003 00:12:18.861477 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/9542b67b-1582-4547-a94e-53952a29248e-v4-0-config-user-template-login\") pod \"oauth-openshift-f54c45747-55xdv\" (UID: \"9542b67b-1582-4547-a94e-53952a29248e\") " pod="openshift-authentication/oauth-openshift-f54c45747-55xdv" Oct 03 00:12:18 crc kubenswrapper[4832]: I1003 00:12:18.861508 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/9542b67b-1582-4547-a94e-53952a29248e-v4-0-config-system-router-certs\") pod \"oauth-openshift-f54c45747-55xdv\" (UID: \"9542b67b-1582-4547-a94e-53952a29248e\") " pod="openshift-authentication/oauth-openshift-f54c45747-55xdv" Oct 03 00:12:18 crc kubenswrapper[4832]: I1003 00:12:18.861535 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/9542b67b-1582-4547-a94e-53952a29248e-v4-0-config-system-session\") pod \"oauth-openshift-f54c45747-55xdv\" (UID: \"9542b67b-1582-4547-a94e-53952a29248e\") " pod="openshift-authentication/oauth-openshift-f54c45747-55xdv" Oct 03 00:12:18 crc kubenswrapper[4832]: I1003 00:12:18.861562 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9542b67b-1582-4547-a94e-53952a29248e-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-f54c45747-55xdv\" (UID: \"9542b67b-1582-4547-a94e-53952a29248e\") " pod="openshift-authentication/oauth-openshift-f54c45747-55xdv" Oct 03 00:12:18 crc kubenswrapper[4832]: I1003 00:12:18.861585 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/9542b67b-1582-4547-a94e-53952a29248e-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-f54c45747-55xdv\" (UID: \"9542b67b-1582-4547-a94e-53952a29248e\") " pod="openshift-authentication/oauth-openshift-f54c45747-55xdv" Oct 03 00:12:18 crc kubenswrapper[4832]: I1003 00:12:18.861606 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/9542b67b-1582-4547-a94e-53952a29248e-v4-0-config-system-service-ca\") pod \"oauth-openshift-f54c45747-55xdv\" (UID: \"9542b67b-1582-4547-a94e-53952a29248e\") " pod="openshift-authentication/oauth-openshift-f54c45747-55xdv" Oct 03 00:12:18 crc kubenswrapper[4832]: I1003 00:12:18.861633 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/9542b67b-1582-4547-a94e-53952a29248e-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-f54c45747-55xdv\" (UID: \"9542b67b-1582-4547-a94e-53952a29248e\") " pod="openshift-authentication/oauth-openshift-f54c45747-55xdv" Oct 03 00:12:18 crc kubenswrapper[4832]: I1003 00:12:18.861657 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/9542b67b-1582-4547-a94e-53952a29248e-v4-0-config-system-serving-cert\") pod \"oauth-openshift-f54c45747-55xdv\" (UID: \"9542b67b-1582-4547-a94e-53952a29248e\") " pod="openshift-authentication/oauth-openshift-f54c45747-55xdv" Oct 03 00:12:18 crc kubenswrapper[4832]: I1003 00:12:18.861688 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/9542b67b-1582-4547-a94e-53952a29248e-v4-0-config-system-cliconfig\") pod \"oauth-openshift-f54c45747-55xdv\" (UID: \"9542b67b-1582-4547-a94e-53952a29248e\") " pod="openshift-authentication/oauth-openshift-f54c45747-55xdv" Oct 03 00:12:18 crc kubenswrapper[4832]: I1003 00:12:18.861713 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9542b67b-1582-4547-a94e-53952a29248e-audit-dir\") pod \"oauth-openshift-f54c45747-55xdv\" (UID: \"9542b67b-1582-4547-a94e-53952a29248e\") " pod="openshift-authentication/oauth-openshift-f54c45747-55xdv" Oct 03 00:12:18 crc kubenswrapper[4832]: I1003 00:12:18.861748 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/9542b67b-1582-4547-a94e-53952a29248e-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-f54c45747-55xdv\" (UID: \"9542b67b-1582-4547-a94e-53952a29248e\") " pod="openshift-authentication/oauth-openshift-f54c45747-55xdv" Oct 03 00:12:18 crc kubenswrapper[4832]: I1003 00:12:18.861796 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/9542b67b-1582-4547-a94e-53952a29248e-audit-policies\") pod \"oauth-openshift-f54c45747-55xdv\" (UID: \"9542b67b-1582-4547-a94e-53952a29248e\") " pod="openshift-authentication/oauth-openshift-f54c45747-55xdv" Oct 03 00:12:18 crc kubenswrapper[4832]: I1003 00:12:18.861832 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/9542b67b-1582-4547-a94e-53952a29248e-v4-0-config-user-template-error\") pod \"oauth-openshift-f54c45747-55xdv\" (UID: \"9542b67b-1582-4547-a94e-53952a29248e\") " pod="openshift-authentication/oauth-openshift-f54c45747-55xdv" Oct 03 00:12:18 crc kubenswrapper[4832]: I1003 00:12:18.861894 4832 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/e422e2ec-57ff-4009-8065-bfa489b34334-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Oct 03 00:12:18 crc kubenswrapper[4832]: I1003 00:12:18.861913 4832 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/e422e2ec-57ff-4009-8065-bfa489b34334-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Oct 03 00:12:18 crc kubenswrapper[4832]: I1003 00:12:18.861931 4832 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/e422e2ec-57ff-4009-8065-bfa489b34334-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Oct 03 00:12:18 crc kubenswrapper[4832]: I1003 00:12:18.861944 4832 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e422e2ec-57ff-4009-8065-bfa489b34334-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 03 00:12:18 crc kubenswrapper[4832]: I1003 00:12:18.861957 4832 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qndd5\" (UniqueName: \"kubernetes.io/projected/e422e2ec-57ff-4009-8065-bfa489b34334-kube-api-access-qndd5\") on node \"crc\" DevicePath \"\"" Oct 03 00:12:18 crc kubenswrapper[4832]: I1003 00:12:18.861971 4832 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/e422e2ec-57ff-4009-8065-bfa489b34334-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Oct 03 00:12:18 crc kubenswrapper[4832]: I1003 00:12:18.861983 4832 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/e422e2ec-57ff-4009-8065-bfa489b34334-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Oct 03 00:12:18 crc kubenswrapper[4832]: I1003 00:12:18.861997 4832 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/e422e2ec-57ff-4009-8065-bfa489b34334-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Oct 03 00:12:18 crc kubenswrapper[4832]: I1003 00:12:18.862010 4832 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/e422e2ec-57ff-4009-8065-bfa489b34334-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 00:12:18 crc kubenswrapper[4832]: I1003 00:12:18.862025 4832 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/e422e2ec-57ff-4009-8065-bfa489b34334-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Oct 03 00:12:18 crc kubenswrapper[4832]: I1003 00:12:18.862225 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9542b67b-1582-4547-a94e-53952a29248e-audit-dir\") pod \"oauth-openshift-f54c45747-55xdv\" (UID: \"9542b67b-1582-4547-a94e-53952a29248e\") " pod="openshift-authentication/oauth-openshift-f54c45747-55xdv" Oct 03 00:12:18 crc kubenswrapper[4832]: I1003 00:12:18.862697 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/9542b67b-1582-4547-a94e-53952a29248e-v4-0-config-system-cliconfig\") pod \"oauth-openshift-f54c45747-55xdv\" (UID: \"9542b67b-1582-4547-a94e-53952a29248e\") 
" pod="openshift-authentication/oauth-openshift-f54c45747-55xdv" Oct 03 00:12:18 crc kubenswrapper[4832]: I1003 00:12:18.862902 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9542b67b-1582-4547-a94e-53952a29248e-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-f54c45747-55xdv\" (UID: \"9542b67b-1582-4547-a94e-53952a29248e\") " pod="openshift-authentication/oauth-openshift-f54c45747-55xdv" Oct 03 00:12:18 crc kubenswrapper[4832]: I1003 00:12:18.862935 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/9542b67b-1582-4547-a94e-53952a29248e-audit-policies\") pod \"oauth-openshift-f54c45747-55xdv\" (UID: \"9542b67b-1582-4547-a94e-53952a29248e\") " pod="openshift-authentication/oauth-openshift-f54c45747-55xdv" Oct 03 00:12:18 crc kubenswrapper[4832]: I1003 00:12:18.863071 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/9542b67b-1582-4547-a94e-53952a29248e-v4-0-config-system-service-ca\") pod \"oauth-openshift-f54c45747-55xdv\" (UID: \"9542b67b-1582-4547-a94e-53952a29248e\") " pod="openshift-authentication/oauth-openshift-f54c45747-55xdv" Oct 03 00:12:18 crc kubenswrapper[4832]: I1003 00:12:18.864727 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/9542b67b-1582-4547-a94e-53952a29248e-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-f54c45747-55xdv\" (UID: \"9542b67b-1582-4547-a94e-53952a29248e\") " pod="openshift-authentication/oauth-openshift-f54c45747-55xdv" Oct 03 00:12:18 crc kubenswrapper[4832]: I1003 00:12:18.864919 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/9542b67b-1582-4547-a94e-53952a29248e-v4-0-config-system-session\") pod \"oauth-openshift-f54c45747-55xdv\" (UID: \"9542b67b-1582-4547-a94e-53952a29248e\") " pod="openshift-authentication/oauth-openshift-f54c45747-55xdv" Oct 03 00:12:18 crc kubenswrapper[4832]: I1003 00:12:18.865173 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/9542b67b-1582-4547-a94e-53952a29248e-v4-0-config-system-router-certs\") pod \"oauth-openshift-f54c45747-55xdv\" (UID: \"9542b67b-1582-4547-a94e-53952a29248e\") " pod="openshift-authentication/oauth-openshift-f54c45747-55xdv" Oct 03 00:12:18 crc kubenswrapper[4832]: I1003 00:12:18.865609 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/9542b67b-1582-4547-a94e-53952a29248e-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-f54c45747-55xdv\" (UID: \"9542b67b-1582-4547-a94e-53952a29248e\") " pod="openshift-authentication/oauth-openshift-f54c45747-55xdv" Oct 03 00:12:18 crc kubenswrapper[4832]: I1003 00:12:18.866527 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/9542b67b-1582-4547-a94e-53952a29248e-v4-0-config-user-template-login\") pod \"oauth-openshift-f54c45747-55xdv\" (UID: \"9542b67b-1582-4547-a94e-53952a29248e\") " pod="openshift-authentication/oauth-openshift-f54c45747-55xdv" Oct 03 00:12:18 crc 
kubenswrapper[4832]: I1003 00:12:18.866810 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/9542b67b-1582-4547-a94e-53952a29248e-v4-0-config-user-template-error\") pod \"oauth-openshift-f54c45747-55xdv\" (UID: \"9542b67b-1582-4547-a94e-53952a29248e\") " pod="openshift-authentication/oauth-openshift-f54c45747-55xdv" Oct 03 00:12:18 crc kubenswrapper[4832]: I1003 00:12:18.867513 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/9542b67b-1582-4547-a94e-53952a29248e-v4-0-config-system-serving-cert\") pod \"oauth-openshift-f54c45747-55xdv\" (UID: \"9542b67b-1582-4547-a94e-53952a29248e\") " pod="openshift-authentication/oauth-openshift-f54c45747-55xdv" Oct 03 00:12:18 crc kubenswrapper[4832]: I1003 00:12:18.867698 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/9542b67b-1582-4547-a94e-53952a29248e-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-f54c45747-55xdv\" (UID: \"9542b67b-1582-4547-a94e-53952a29248e\") " pod="openshift-authentication/oauth-openshift-f54c45747-55xdv" Oct 03 00:12:18 crc kubenswrapper[4832]: I1003 00:12:18.878091 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2bng\" (UniqueName: \"kubernetes.io/projected/9542b67b-1582-4547-a94e-53952a29248e-kube-api-access-z2bng\") pod \"oauth-openshift-f54c45747-55xdv\" (UID: \"9542b67b-1582-4547-a94e-53952a29248e\") " pod="openshift-authentication/oauth-openshift-f54c45747-55xdv" Oct 03 00:12:18 crc kubenswrapper[4832]: I1003 00:12:18.915579 4832 generic.go:334] "Generic (PLEG): container finished" podID="e422e2ec-57ff-4009-8065-bfa489b34334" containerID="a4a5ffc139efce7135582f03ab73f42d14b384b7e5ef935c34b1b475bdd5c4b4" exitCode=0 Oct 03 00:12:18 crc kubenswrapper[4832]: I1003 00:12:18.915651 4832 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-sl4zz" Oct 03 00:12:18 crc kubenswrapper[4832]: I1003 00:12:18.915656 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-sl4zz" event={"ID":"e422e2ec-57ff-4009-8065-bfa489b34334","Type":"ContainerDied","Data":"a4a5ffc139efce7135582f03ab73f42d14b384b7e5ef935c34b1b475bdd5c4b4"} Oct 03 00:12:18 crc kubenswrapper[4832]: I1003 00:12:18.915796 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-sl4zz" event={"ID":"e422e2ec-57ff-4009-8065-bfa489b34334","Type":"ContainerDied","Data":"c9c0df18f5f57b232b73dd36a46dc439ebf0e31fcae0b2908750ca6fa19887a2"} Oct 03 00:12:18 crc kubenswrapper[4832]: I1003 00:12:18.915822 4832 scope.go:117] "RemoveContainer" containerID="a4a5ffc139efce7135582f03ab73f42d14b384b7e5ef935c34b1b475bdd5c4b4" Oct 03 00:12:18 crc kubenswrapper[4832]: I1003 00:12:18.933045 4832 scope.go:117] "RemoveContainer" containerID="a4a5ffc139efce7135582f03ab73f42d14b384b7e5ef935c34b1b475bdd5c4b4" Oct 03 00:12:18 crc kubenswrapper[4832]: E1003 00:12:18.933426 4832 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a4a5ffc139efce7135582f03ab73f42d14b384b7e5ef935c34b1b475bdd5c4b4\": container with ID starting with a4a5ffc139efce7135582f03ab73f42d14b384b7e5ef935c34b1b475bdd5c4b4 not found: ID does not exist" containerID="a4a5ffc139efce7135582f03ab73f42d14b384b7e5ef935c34b1b475bdd5c4b4" Oct 03 00:12:18 crc kubenswrapper[4832]: I1003 00:12:18.933461 4832 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4a5ffc139efce7135582f03ab73f42d14b384b7e5ef935c34b1b475bdd5c4b4"} err="failed to get container status \"a4a5ffc139efce7135582f03ab73f42d14b384b7e5ef935c34b1b475bdd5c4b4\": rpc error: code = NotFound desc = could not find container \"a4a5ffc139efce7135582f03ab73f42d14b384b7e5ef935c34b1b475bdd5c4b4\": container with ID starting with a4a5ffc139efce7135582f03ab73f42d14b384b7e5ef935c34b1b475bdd5c4b4 not found: ID does not exist" Oct 03 00:12:18 crc kubenswrapper[4832]: I1003 00:12:18.942237 4832 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-sl4zz"] Oct 03 00:12:18 crc kubenswrapper[4832]: I1003 00:12:18.944476 4832 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-sl4zz"] Oct 03 00:12:19 crc kubenswrapper[4832]: I1003 00:12:19.008875 4832 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-f54c45747-55xdv" Oct 03 00:12:19 crc kubenswrapper[4832]: I1003 00:12:19.183627 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-f54c45747-55xdv"] Oct 03 00:12:19 crc kubenswrapper[4832]: I1003 00:12:19.392116 4832 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e422e2ec-57ff-4009-8065-bfa489b34334" path="/var/lib/kubelet/pods/e422e2ec-57ff-4009-8065-bfa489b34334/volumes" Oct 03 00:12:19 crc kubenswrapper[4832]: I1003 00:12:19.921985 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-f54c45747-55xdv" event={"ID":"9542b67b-1582-4547-a94e-53952a29248e","Type":"ContainerStarted","Data":"6779b8a466406bf07d7f632cd40a6c301bf16458f8f0e48911fbfcf1a1f5257c"} Oct 03 00:12:19 crc kubenswrapper[4832]: I1003 00:12:19.922022 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-f54c45747-55xdv" event={"ID":"9542b67b-1582-4547-a94e-53952a29248e","Type":"ContainerStarted","Data":"ff1ccf5c460eac4d554a0bf041f2fe5814db1ed93fd68a43d6fa4a2c6e346b5a"} Oct 03 00:12:19 crc kubenswrapper[4832]: I1003 00:12:19.922322 4832 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-f54c45747-55xdv" Oct 03 00:12:19 crc kubenswrapper[4832]: I1003 00:12:19.928255 4832 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-f54c45747-55xdv" Oct 03 00:12:19 crc kubenswrapper[4832]: I1003 00:12:19.940913 4832 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-f54c45747-55xdv" podStartSLOduration=26.940895936 podStartE2EDuration="26.940895936s" podCreationTimestamp="2025-10-03 00:11:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 00:12:19.938795941 +0000 UTC m=+333.200951047" watchObservedRunningTime="2025-10-03 00:12:19.940895936 +0000 UTC m=+333.203051032" Oct 03 00:12:26 crc kubenswrapper[4832]: I1003 00:12:26.396733 4832 patch_prober.go:28] interesting pod/machine-config-daemon-xzpk4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 00:12:26 crc kubenswrapper[4832]: I1003 00:12:26.397320 4832 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" podUID="13eeb165-19f7-4e3f-a442-1c9c570d853b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 00:12:56 crc kubenswrapper[4832]: I1003 00:12:56.396556 4832 patch_prober.go:28] interesting pod/machine-config-daemon-xzpk4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 00:12:56 crc kubenswrapper[4832]: I1003 00:12:56.397227 4832 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" podUID="13eeb165-19f7-4e3f-a442-1c9c570d853b" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 00:13:26 crc kubenswrapper[4832]: I1003 00:13:26.396900 4832 patch_prober.go:28] interesting pod/machine-config-daemon-xzpk4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 00:13:26 crc kubenswrapper[4832]: I1003 00:13:26.397501 4832 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" podUID="13eeb165-19f7-4e3f-a442-1c9c570d853b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 00:13:26 crc kubenswrapper[4832]: I1003 00:13:26.397537 4832 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" Oct 03 00:13:26 crc kubenswrapper[4832]: I1003 00:13:26.398012 4832 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"aac53b9fd0b5433a7a5671b4f58bd6f53d19df336e73459d1ccd662324d7d19e"} pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 00:13:26 crc kubenswrapper[4832]: I1003 00:13:26.398094 4832 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" podUID="13eeb165-19f7-4e3f-a442-1c9c570d853b" containerName="machine-config-daemon" containerID="cri-o://aac53b9fd0b5433a7a5671b4f58bd6f53d19df336e73459d1ccd662324d7d19e" gracePeriod=600 Oct 03 00:13:27 crc kubenswrapper[4832]: I1003 00:13:27.257523 4832 generic.go:334] "Generic (PLEG): container finished" podID="13eeb165-19f7-4e3f-a442-1c9c570d853b" containerID="aac53b9fd0b5433a7a5671b4f58bd6f53d19df336e73459d1ccd662324d7d19e" exitCode=0 Oct 03 00:13:27 crc kubenswrapper[4832]: I1003 00:13:27.257615 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" event={"ID":"13eeb165-19f7-4e3f-a442-1c9c570d853b","Type":"ContainerDied","Data":"aac53b9fd0b5433a7a5671b4f58bd6f53d19df336e73459d1ccd662324d7d19e"} Oct 03 00:13:27 crc kubenswrapper[4832]: I1003 00:13:27.257895 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" event={"ID":"13eeb165-19f7-4e3f-a442-1c9c570d853b","Type":"ContainerStarted","Data":"ccf77e912021030d6d11e5b9f7160acaf1d5ac5755da48831bdb9eaaf439f604"} Oct 03 00:13:27 crc kubenswrapper[4832]: I1003 00:13:27.257921 4832 scope.go:117] "RemoveContainer" containerID="786805325c320e5c08542c14a9ac9790f7c2dec237737fe1aca1523d1f2d26a5" Oct 03 00:14:27 crc kubenswrapper[4832]: I1003 00:14:27.704876 4832 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-clgw8"] Oct 03 00:14:27 crc kubenswrapper[4832]: I1003 00:14:27.706093 4832 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-clgw8" Oct 03 00:14:27 crc kubenswrapper[4832]: I1003 00:14:27.718341 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-clgw8"] Oct 03 00:14:27 crc kubenswrapper[4832]: I1003 00:14:27.730926 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/298154da-4f23-4641-9cfd-ae0742fa0663-registry-tls\") pod \"image-registry-66df7c8f76-clgw8\" (UID: \"298154da-4f23-4641-9cfd-ae0742fa0663\") " pod="openshift-image-registry/image-registry-66df7c8f76-clgw8" Oct 03 00:14:27 crc kubenswrapper[4832]: I1003 00:14:27.730974 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/298154da-4f23-4641-9cfd-ae0742fa0663-ca-trust-extracted\") pod \"image-registry-66df7c8f76-clgw8\" (UID: \"298154da-4f23-4641-9cfd-ae0742fa0663\") " pod="openshift-image-registry/image-registry-66df7c8f76-clgw8" Oct 03 00:14:27 crc kubenswrapper[4832]: I1003 00:14:27.731005 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/298154da-4f23-4641-9cfd-ae0742fa0663-bound-sa-token\") pod \"image-registry-66df7c8f76-clgw8\" (UID: \"298154da-4f23-4641-9cfd-ae0742fa0663\") " pod="openshift-image-registry/image-registry-66df7c8f76-clgw8" Oct 03 00:14:27 crc kubenswrapper[4832]: I1003 00:14:27.731025 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qr8lb\" (UniqueName: \"kubernetes.io/projected/298154da-4f23-4641-9cfd-ae0742fa0663-kube-api-access-qr8lb\") pod \"image-registry-66df7c8f76-clgw8\" (UID: \"298154da-4f23-4641-9cfd-ae0742fa0663\") " pod="openshift-image-registry/image-registry-66df7c8f76-clgw8" Oct 03 00:14:27 crc kubenswrapper[4832]: I1003 00:14:27.731051 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/298154da-4f23-4641-9cfd-ae0742fa0663-installation-pull-secrets\") pod \"image-registry-66df7c8f76-clgw8\" (UID: \"298154da-4f23-4641-9cfd-ae0742fa0663\") " pod="openshift-image-registry/image-registry-66df7c8f76-clgw8" Oct 03 00:14:27 crc kubenswrapper[4832]: I1003 00:14:27.731069 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/298154da-4f23-4641-9cfd-ae0742fa0663-trusted-ca\") pod \"image-registry-66df7c8f76-clgw8\" (UID: \"298154da-4f23-4641-9cfd-ae0742fa0663\") " pod="openshift-image-registry/image-registry-66df7c8f76-clgw8" Oct 03 00:14:27 crc kubenswrapper[4832]: I1003 00:14:27.731101 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-clgw8\" (UID: \"298154da-4f23-4641-9cfd-ae0742fa0663\") " pod="openshift-image-registry/image-registry-66df7c8f76-clgw8" Oct 03 00:14:27 crc kubenswrapper[4832]: I1003 00:14:27.731132 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: 
\"kubernetes.io/configmap/298154da-4f23-4641-9cfd-ae0742fa0663-registry-certificates\") pod \"image-registry-66df7c8f76-clgw8\" (UID: \"298154da-4f23-4641-9cfd-ae0742fa0663\") " pod="openshift-image-registry/image-registry-66df7c8f76-clgw8" Oct 03 00:14:27 crc kubenswrapper[4832]: I1003 00:14:27.769204 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-clgw8\" (UID: \"298154da-4f23-4641-9cfd-ae0742fa0663\") " pod="openshift-image-registry/image-registry-66df7c8f76-clgw8" Oct 03 00:14:27 crc kubenswrapper[4832]: I1003 00:14:27.832001 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/298154da-4f23-4641-9cfd-ae0742fa0663-registry-certificates\") pod \"image-registry-66df7c8f76-clgw8\" (UID: \"298154da-4f23-4641-9cfd-ae0742fa0663\") " pod="openshift-image-registry/image-registry-66df7c8f76-clgw8" Oct 03 00:14:27 crc kubenswrapper[4832]: I1003 00:14:27.832108 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/298154da-4f23-4641-9cfd-ae0742fa0663-registry-tls\") pod \"image-registry-66df7c8f76-clgw8\" (UID: \"298154da-4f23-4641-9cfd-ae0742fa0663\") " pod="openshift-image-registry/image-registry-66df7c8f76-clgw8" Oct 03 00:14:27 crc kubenswrapper[4832]: I1003 00:14:27.832137 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/298154da-4f23-4641-9cfd-ae0742fa0663-ca-trust-extracted\") pod \"image-registry-66df7c8f76-clgw8\" (UID: \"298154da-4f23-4641-9cfd-ae0742fa0663\") " pod="openshift-image-registry/image-registry-66df7c8f76-clgw8" Oct 03 00:14:27 crc kubenswrapper[4832]: I1003 00:14:27.832167 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/298154da-4f23-4641-9cfd-ae0742fa0663-bound-sa-token\") pod \"image-registry-66df7c8f76-clgw8\" (UID: \"298154da-4f23-4641-9cfd-ae0742fa0663\") " pod="openshift-image-registry/image-registry-66df7c8f76-clgw8" Oct 03 00:14:27 crc kubenswrapper[4832]: I1003 00:14:27.832194 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qr8lb\" (UniqueName: \"kubernetes.io/projected/298154da-4f23-4641-9cfd-ae0742fa0663-kube-api-access-qr8lb\") pod \"image-registry-66df7c8f76-clgw8\" (UID: \"298154da-4f23-4641-9cfd-ae0742fa0663\") " pod="openshift-image-registry/image-registry-66df7c8f76-clgw8" Oct 03 00:14:27 crc kubenswrapper[4832]: I1003 00:14:27.832229 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/298154da-4f23-4641-9cfd-ae0742fa0663-installation-pull-secrets\") pod \"image-registry-66df7c8f76-clgw8\" (UID: \"298154da-4f23-4641-9cfd-ae0742fa0663\") " pod="openshift-image-registry/image-registry-66df7c8f76-clgw8" Oct 03 00:14:27 crc kubenswrapper[4832]: I1003 00:14:27.832251 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/298154da-4f23-4641-9cfd-ae0742fa0663-trusted-ca\") pod \"image-registry-66df7c8f76-clgw8\" (UID: \"298154da-4f23-4641-9cfd-ae0742fa0663\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-clgw8" Oct 03 00:14:27 crc kubenswrapper[4832]: I1003 00:14:27.832946 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/298154da-4f23-4641-9cfd-ae0742fa0663-ca-trust-extracted\") pod \"image-registry-66df7c8f76-clgw8\" (UID: \"298154da-4f23-4641-9cfd-ae0742fa0663\") " pod="openshift-image-registry/image-registry-66df7c8f76-clgw8" Oct 03 00:14:27 crc kubenswrapper[4832]: I1003 00:14:27.833352 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/298154da-4f23-4641-9cfd-ae0742fa0663-registry-certificates\") pod \"image-registry-66df7c8f76-clgw8\" (UID: \"298154da-4f23-4641-9cfd-ae0742fa0663\") " pod="openshift-image-registry/image-registry-66df7c8f76-clgw8" Oct 03 00:14:27 crc kubenswrapper[4832]: I1003 00:14:27.833709 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/298154da-4f23-4641-9cfd-ae0742fa0663-trusted-ca\") pod \"image-registry-66df7c8f76-clgw8\" (UID: \"298154da-4f23-4641-9cfd-ae0742fa0663\") " pod="openshift-image-registry/image-registry-66df7c8f76-clgw8" Oct 03 00:14:27 crc kubenswrapper[4832]: I1003 00:14:27.838162 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/298154da-4f23-4641-9cfd-ae0742fa0663-installation-pull-secrets\") pod \"image-registry-66df7c8f76-clgw8\" (UID: \"298154da-4f23-4641-9cfd-ae0742fa0663\") " pod="openshift-image-registry/image-registry-66df7c8f76-clgw8" Oct 03 00:14:27 crc kubenswrapper[4832]: I1003 00:14:27.838479 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/298154da-4f23-4641-9cfd-ae0742fa0663-registry-tls\") pod \"image-registry-66df7c8f76-clgw8\" (UID: \"298154da-4f23-4641-9cfd-ae0742fa0663\") " pod="openshift-image-registry/image-registry-66df7c8f76-clgw8" Oct 03 00:14:27 crc kubenswrapper[4832]: I1003 00:14:27.853858 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qr8lb\" (UniqueName: \"kubernetes.io/projected/298154da-4f23-4641-9cfd-ae0742fa0663-kube-api-access-qr8lb\") pod \"image-registry-66df7c8f76-clgw8\" (UID: \"298154da-4f23-4641-9cfd-ae0742fa0663\") " pod="openshift-image-registry/image-registry-66df7c8f76-clgw8" Oct 03 00:14:27 crc kubenswrapper[4832]: I1003 00:14:27.855101 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/298154da-4f23-4641-9cfd-ae0742fa0663-bound-sa-token\") pod \"image-registry-66df7c8f76-clgw8\" (UID: \"298154da-4f23-4641-9cfd-ae0742fa0663\") " pod="openshift-image-registry/image-registry-66df7c8f76-clgw8" Oct 03 00:14:28 crc kubenswrapper[4832]: I1003 00:14:28.026630 4832 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-clgw8" Oct 03 00:14:28 crc kubenswrapper[4832]: I1003 00:14:28.203730 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-clgw8"] Oct 03 00:14:28 crc kubenswrapper[4832]: I1003 00:14:28.543468 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-clgw8" event={"ID":"298154da-4f23-4641-9cfd-ae0742fa0663","Type":"ContainerStarted","Data":"25c322454a5a4aadc91eaa47d0d33109f2ff1586baef6ae7f9564014cf8b4414"} Oct 03 00:14:29 crc kubenswrapper[4832]: I1003 00:14:29.548787 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-clgw8" event={"ID":"298154da-4f23-4641-9cfd-ae0742fa0663","Type":"ContainerStarted","Data":"1bad572092c7729bc6e905160f11f7f77ad7a9ffa8e284e7cffd811986bd1df8"} Oct 03 00:14:29 crc kubenswrapper[4832]: I1003 00:14:29.549115 4832 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-clgw8" Oct 03 00:14:29 crc kubenswrapper[4832]: I1003 00:14:29.567729 4832 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-clgw8" podStartSLOduration=2.567711296 podStartE2EDuration="2.567711296s" podCreationTimestamp="2025-10-03 00:14:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 00:14:29.565872547 +0000 UTC m=+462.828027663" watchObservedRunningTime="2025-10-03 00:14:29.567711296 +0000 UTC m=+462.829866402" Oct 03 00:14:48 crc kubenswrapper[4832]: I1003 00:14:48.031906 4832 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-clgw8" Oct 03 00:14:48 crc kubenswrapper[4832]: I1003 00:14:48.076871 4832 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-h4hkd"] Oct 03 00:15:00 crc kubenswrapper[4832]: I1003 00:15:00.128858 4832 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324175-lcd25"] Oct 03 00:15:00 crc kubenswrapper[4832]: I1003 00:15:00.130670 4832 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324175-lcd25" Oct 03 00:15:00 crc kubenswrapper[4832]: I1003 00:15:00.132482 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 03 00:15:00 crc kubenswrapper[4832]: I1003 00:15:00.132891 4832 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 03 00:15:00 crc kubenswrapper[4832]: I1003 00:15:00.140832 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324175-lcd25"] Oct 03 00:15:00 crc kubenswrapper[4832]: I1003 00:15:00.244262 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/68737edf-710b-4ea9-af60-305057b107a7-secret-volume\") pod \"collect-profiles-29324175-lcd25\" (UID: \"68737edf-710b-4ea9-af60-305057b107a7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324175-lcd25" Oct 03 00:15:00 crc kubenswrapper[4832]: I1003 00:15:00.253793 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rp8v\" (UniqueName: \"kubernetes.io/projected/68737edf-710b-4ea9-af60-305057b107a7-kube-api-access-9rp8v\") pod \"collect-profiles-29324175-lcd25\" (UID: \"68737edf-710b-4ea9-af60-305057b107a7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324175-lcd25" Oct 03 00:15:00 crc kubenswrapper[4832]: I1003 00:15:00.254043 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/68737edf-710b-4ea9-af60-305057b107a7-config-volume\") pod \"collect-profiles-29324175-lcd25\" (UID: \"68737edf-710b-4ea9-af60-305057b107a7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324175-lcd25" Oct 03 00:15:00 crc kubenswrapper[4832]: I1003 00:15:00.355710 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9rp8v\" (UniqueName: \"kubernetes.io/projected/68737edf-710b-4ea9-af60-305057b107a7-kube-api-access-9rp8v\") pod \"collect-profiles-29324175-lcd25\" (UID: \"68737edf-710b-4ea9-af60-305057b107a7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324175-lcd25" Oct 03 00:15:00 crc kubenswrapper[4832]: I1003 00:15:00.355790 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/68737edf-710b-4ea9-af60-305057b107a7-config-volume\") pod \"collect-profiles-29324175-lcd25\" (UID: \"68737edf-710b-4ea9-af60-305057b107a7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324175-lcd25" Oct 03 00:15:00 crc kubenswrapper[4832]: I1003 00:15:00.355854 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/68737edf-710b-4ea9-af60-305057b107a7-secret-volume\") pod \"collect-profiles-29324175-lcd25\" (UID: \"68737edf-710b-4ea9-af60-305057b107a7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324175-lcd25" Oct 03 00:15:00 crc kubenswrapper[4832]: I1003 00:15:00.357235 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/68737edf-710b-4ea9-af60-305057b107a7-config-volume\") pod 
\"collect-profiles-29324175-lcd25\" (UID: \"68737edf-710b-4ea9-af60-305057b107a7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324175-lcd25" Oct 03 00:15:00 crc kubenswrapper[4832]: I1003 00:15:00.361872 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/68737edf-710b-4ea9-af60-305057b107a7-secret-volume\") pod \"collect-profiles-29324175-lcd25\" (UID: \"68737edf-710b-4ea9-af60-305057b107a7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324175-lcd25" Oct 03 00:15:00 crc kubenswrapper[4832]: I1003 00:15:00.371270 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rp8v\" (UniqueName: \"kubernetes.io/projected/68737edf-710b-4ea9-af60-305057b107a7-kube-api-access-9rp8v\") pod \"collect-profiles-29324175-lcd25\" (UID: \"68737edf-710b-4ea9-af60-305057b107a7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324175-lcd25" Oct 03 00:15:00 crc kubenswrapper[4832]: I1003 00:15:00.448258 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324175-lcd25" Oct 03 00:15:00 crc kubenswrapper[4832]: I1003 00:15:00.612744 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324175-lcd25"] Oct 03 00:15:00 crc kubenswrapper[4832]: I1003 00:15:00.702318 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324175-lcd25" event={"ID":"68737edf-710b-4ea9-af60-305057b107a7","Type":"ContainerStarted","Data":"6d05daba14b500f2fabd04bd51815646a28a95385b7581f1b6af7b9b022b93d6"} Oct 03 00:15:01 crc kubenswrapper[4832]: I1003 00:15:01.708422 4832 generic.go:334] "Generic (PLEG): container finished" podID="68737edf-710b-4ea9-af60-305057b107a7" containerID="ff3300a908434b0d0e1c08621ab9f1d53afc6e142fb617ac051eedb58b10fe54" exitCode=0 Oct 03 00:15:01 crc kubenswrapper[4832]: I1003 00:15:01.708477 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324175-lcd25" event={"ID":"68737edf-710b-4ea9-af60-305057b107a7","Type":"ContainerDied","Data":"ff3300a908434b0d0e1c08621ab9f1d53afc6e142fb617ac051eedb58b10fe54"} Oct 03 00:15:02 crc kubenswrapper[4832]: I1003 00:15:02.900025 4832 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324175-lcd25" Oct 03 00:15:02 crc kubenswrapper[4832]: I1003 00:15:02.992467 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9rp8v\" (UniqueName: \"kubernetes.io/projected/68737edf-710b-4ea9-af60-305057b107a7-kube-api-access-9rp8v\") pod \"68737edf-710b-4ea9-af60-305057b107a7\" (UID: \"68737edf-710b-4ea9-af60-305057b107a7\") " Oct 03 00:15:02 crc kubenswrapper[4832]: I1003 00:15:02.992522 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/68737edf-710b-4ea9-af60-305057b107a7-secret-volume\") pod \"68737edf-710b-4ea9-af60-305057b107a7\" (UID: \"68737edf-710b-4ea9-af60-305057b107a7\") " Oct 03 00:15:02 crc kubenswrapper[4832]: I1003 00:15:02.992565 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/68737edf-710b-4ea9-af60-305057b107a7-config-volume\") pod \"68737edf-710b-4ea9-af60-305057b107a7\" (UID: \"68737edf-710b-4ea9-af60-305057b107a7\") " Oct 03 00:15:02 crc kubenswrapper[4832]: I1003 00:15:02.993252 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68737edf-710b-4ea9-af60-305057b107a7-config-volume" (OuterVolumeSpecName: "config-volume") pod "68737edf-710b-4ea9-af60-305057b107a7" (UID: "68737edf-710b-4ea9-af60-305057b107a7"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 00:15:02 crc kubenswrapper[4832]: I1003 00:15:02.998694 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68737edf-710b-4ea9-af60-305057b107a7-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "68737edf-710b-4ea9-af60-305057b107a7" (UID: "68737edf-710b-4ea9-af60-305057b107a7"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 00:15:03 crc kubenswrapper[4832]: I1003 00:15:03.000479 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68737edf-710b-4ea9-af60-305057b107a7-kube-api-access-9rp8v" (OuterVolumeSpecName: "kube-api-access-9rp8v") pod "68737edf-710b-4ea9-af60-305057b107a7" (UID: "68737edf-710b-4ea9-af60-305057b107a7"). InnerVolumeSpecName "kube-api-access-9rp8v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 00:15:03 crc kubenswrapper[4832]: I1003 00:15:03.094309 4832 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9rp8v\" (UniqueName: \"kubernetes.io/projected/68737edf-710b-4ea9-af60-305057b107a7-kube-api-access-9rp8v\") on node \"crc\" DevicePath \"\"" Oct 03 00:15:03 crc kubenswrapper[4832]: I1003 00:15:03.094566 4832 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/68737edf-710b-4ea9-af60-305057b107a7-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 03 00:15:03 crc kubenswrapper[4832]: I1003 00:15:03.094575 4832 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/68737edf-710b-4ea9-af60-305057b107a7-config-volume\") on node \"crc\" DevicePath \"\"" Oct 03 00:15:03 crc kubenswrapper[4832]: I1003 00:15:03.717793 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324175-lcd25" event={"ID":"68737edf-710b-4ea9-af60-305057b107a7","Type":"ContainerDied","Data":"6d05daba14b500f2fabd04bd51815646a28a95385b7581f1b6af7b9b022b93d6"} Oct 03 00:15:03 crc kubenswrapper[4832]: I1003 00:15:03.717846 4832 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6d05daba14b500f2fabd04bd51815646a28a95385b7581f1b6af7b9b022b93d6" Oct 03 00:15:03 crc kubenswrapper[4832]: I1003 00:15:03.717819 4832 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324175-lcd25" Oct 03 00:15:13 crc kubenswrapper[4832]: I1003 00:15:13.112414 4832 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-h4hkd" podUID="8793993f-a364-4eb7-bb10-d30f58f0e2c0" containerName="registry" containerID="cri-o://4ca62c8692c8cc26e90fbdcc0cbe22ff5fac84414196c7ef81506108333cd86b" gracePeriod=30 Oct 03 00:15:13 crc kubenswrapper[4832]: I1003 00:15:13.486383 4832 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-h4hkd" Oct 03 00:15:13 crc kubenswrapper[4832]: I1003 00:15:13.536103 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8793993f-a364-4eb7-bb10-d30f58f0e2c0-trusted-ca\") pod \"8793993f-a364-4eb7-bb10-d30f58f0e2c0\" (UID: \"8793993f-a364-4eb7-bb10-d30f58f0e2c0\") " Oct 03 00:15:13 crc kubenswrapper[4832]: I1003 00:15:13.536208 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8793993f-a364-4eb7-bb10-d30f58f0e2c0-ca-trust-extracted\") pod \"8793993f-a364-4eb7-bb10-d30f58f0e2c0\" (UID: \"8793993f-a364-4eb7-bb10-d30f58f0e2c0\") " Oct 03 00:15:13 crc kubenswrapper[4832]: I1003 00:15:13.536248 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqc6m\" (UniqueName: \"kubernetes.io/projected/8793993f-a364-4eb7-bb10-d30f58f0e2c0-kube-api-access-fqc6m\") pod \"8793993f-a364-4eb7-bb10-d30f58f0e2c0\" (UID: \"8793993f-a364-4eb7-bb10-d30f58f0e2c0\") " Oct 03 00:15:13 crc kubenswrapper[4832]: I1003 00:15:13.536336 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8793993f-a364-4eb7-bb10-d30f58f0e2c0-registry-tls\") pod \"8793993f-a364-4eb7-bb10-d30f58f0e2c0\" (UID: \"8793993f-a364-4eb7-bb10-d30f58f0e2c0\") " Oct 03 00:15:13 crc kubenswrapper[4832]: I1003 00:15:13.536379 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8793993f-a364-4eb7-bb10-d30f58f0e2c0-bound-sa-token\") pod \"8793993f-a364-4eb7-bb10-d30f58f0e2c0\" (UID: \"8793993f-a364-4eb7-bb10-d30f58f0e2c0\") " Oct 03 00:15:13 crc kubenswrapper[4832]: I1003 00:15:13.536565 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8793993f-a364-4eb7-bb10-d30f58f0e2c0\" (UID: \"8793993f-a364-4eb7-bb10-d30f58f0e2c0\") " Oct 03 00:15:13 crc kubenswrapper[4832]: I1003 00:15:13.536589 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8793993f-a364-4eb7-bb10-d30f58f0e2c0-installation-pull-secrets\") pod \"8793993f-a364-4eb7-bb10-d30f58f0e2c0\" (UID: \"8793993f-a364-4eb7-bb10-d30f58f0e2c0\") " Oct 03 00:15:13 crc kubenswrapper[4832]: I1003 00:15:13.536610 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8793993f-a364-4eb7-bb10-d30f58f0e2c0-registry-certificates\") pod \"8793993f-a364-4eb7-bb10-d30f58f0e2c0\" (UID: \"8793993f-a364-4eb7-bb10-d30f58f0e2c0\") " Oct 03 00:15:13 crc kubenswrapper[4832]: I1003 00:15:13.537615 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8793993f-a364-4eb7-bb10-d30f58f0e2c0-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8793993f-a364-4eb7-bb10-d30f58f0e2c0" (UID: "8793993f-a364-4eb7-bb10-d30f58f0e2c0"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 00:15:13 crc kubenswrapper[4832]: I1003 00:15:13.537655 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8793993f-a364-4eb7-bb10-d30f58f0e2c0-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8793993f-a364-4eb7-bb10-d30f58f0e2c0" (UID: "8793993f-a364-4eb7-bb10-d30f58f0e2c0"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 00:15:13 crc kubenswrapper[4832]: I1003 00:15:13.537780 4832 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8793993f-a364-4eb7-bb10-d30f58f0e2c0-registry-certificates\") on node \"crc\" DevicePath \"\"" Oct 03 00:15:13 crc kubenswrapper[4832]: I1003 00:15:13.537799 4832 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8793993f-a364-4eb7-bb10-d30f58f0e2c0-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 03 00:15:13 crc kubenswrapper[4832]: I1003 00:15:13.549051 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8793993f-a364-4eb7-bb10-d30f58f0e2c0-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8793993f-a364-4eb7-bb10-d30f58f0e2c0" (UID: "8793993f-a364-4eb7-bb10-d30f58f0e2c0"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 00:15:13 crc kubenswrapper[4832]: I1003 00:15:13.549163 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8793993f-a364-4eb7-bb10-d30f58f0e2c0-kube-api-access-fqc6m" (OuterVolumeSpecName: "kube-api-access-fqc6m") pod "8793993f-a364-4eb7-bb10-d30f58f0e2c0" (UID: "8793993f-a364-4eb7-bb10-d30f58f0e2c0"). InnerVolumeSpecName "kube-api-access-fqc6m". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 00:15:13 crc kubenswrapper[4832]: I1003 00:15:13.549408 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8793993f-a364-4eb7-bb10-d30f58f0e2c0-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8793993f-a364-4eb7-bb10-d30f58f0e2c0" (UID: "8793993f-a364-4eb7-bb10-d30f58f0e2c0"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 00:15:13 crc kubenswrapper[4832]: I1003 00:15:13.549443 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "8793993f-a364-4eb7-bb10-d30f58f0e2c0" (UID: "8793993f-a364-4eb7-bb10-d30f58f0e2c0"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 03 00:15:13 crc kubenswrapper[4832]: I1003 00:15:13.549966 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8793993f-a364-4eb7-bb10-d30f58f0e2c0-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8793993f-a364-4eb7-bb10-d30f58f0e2c0" (UID: "8793993f-a364-4eb7-bb10-d30f58f0e2c0"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 00:15:13 crc kubenswrapper[4832]: I1003 00:15:13.552415 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8793993f-a364-4eb7-bb10-d30f58f0e2c0-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8793993f-a364-4eb7-bb10-d30f58f0e2c0" (UID: "8793993f-a364-4eb7-bb10-d30f58f0e2c0"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 00:15:13 crc kubenswrapper[4832]: I1003 00:15:13.639006 4832 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8793993f-a364-4eb7-bb10-d30f58f0e2c0-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Oct 03 00:15:13 crc kubenswrapper[4832]: I1003 00:15:13.639042 4832 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqc6m\" (UniqueName: \"kubernetes.io/projected/8793993f-a364-4eb7-bb10-d30f58f0e2c0-kube-api-access-fqc6m\") on node \"crc\" DevicePath \"\"" Oct 03 00:15:13 crc kubenswrapper[4832]: I1003 00:15:13.639053 4832 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8793993f-a364-4eb7-bb10-d30f58f0e2c0-registry-tls\") on node \"crc\" DevicePath \"\"" Oct 03 00:15:13 crc kubenswrapper[4832]: I1003 00:15:13.639062 4832 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8793993f-a364-4eb7-bb10-d30f58f0e2c0-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 03 00:15:13 crc kubenswrapper[4832]: I1003 00:15:13.639072 4832 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8793993f-a364-4eb7-bb10-d30f58f0e2c0-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Oct 03 00:15:13 crc kubenswrapper[4832]: I1003 00:15:13.773879 4832 generic.go:334] "Generic (PLEG): container finished" podID="8793993f-a364-4eb7-bb10-d30f58f0e2c0" containerID="4ca62c8692c8cc26e90fbdcc0cbe22ff5fac84414196c7ef81506108333cd86b" exitCode=0 Oct 03 00:15:13 crc kubenswrapper[4832]: I1003 00:15:13.773930 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-h4hkd" event={"ID":"8793993f-a364-4eb7-bb10-d30f58f0e2c0","Type":"ContainerDied","Data":"4ca62c8692c8cc26e90fbdcc0cbe22ff5fac84414196c7ef81506108333cd86b"} Oct 03 00:15:13 crc kubenswrapper[4832]: I1003 00:15:13.773966 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-h4hkd" event={"ID":"8793993f-a364-4eb7-bb10-d30f58f0e2c0","Type":"ContainerDied","Data":"140f7e1baf89868a37933c11aa634084ea0b0c952f355d85990e729aa6bdb77e"} Oct 03 00:15:13 crc kubenswrapper[4832]: I1003 00:15:13.773988 4832 scope.go:117] "RemoveContainer" containerID="4ca62c8692c8cc26e90fbdcc0cbe22ff5fac84414196c7ef81506108333cd86b" Oct 03 00:15:13 crc kubenswrapper[4832]: I1003 00:15:13.774368 4832 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-h4hkd" Oct 03 00:15:13 crc kubenswrapper[4832]: I1003 00:15:13.790220 4832 scope.go:117] "RemoveContainer" containerID="4ca62c8692c8cc26e90fbdcc0cbe22ff5fac84414196c7ef81506108333cd86b" Oct 03 00:15:13 crc kubenswrapper[4832]: E1003 00:15:13.790822 4832 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ca62c8692c8cc26e90fbdcc0cbe22ff5fac84414196c7ef81506108333cd86b\": container with ID starting with 4ca62c8692c8cc26e90fbdcc0cbe22ff5fac84414196c7ef81506108333cd86b not found: ID does not exist" containerID="4ca62c8692c8cc26e90fbdcc0cbe22ff5fac84414196c7ef81506108333cd86b" Oct 03 00:15:13 crc kubenswrapper[4832]: I1003 00:15:13.790943 4832 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ca62c8692c8cc26e90fbdcc0cbe22ff5fac84414196c7ef81506108333cd86b"} err="failed to get container status \"4ca62c8692c8cc26e90fbdcc0cbe22ff5fac84414196c7ef81506108333cd86b\": rpc error: code = NotFound desc = could not find container \"4ca62c8692c8cc26e90fbdcc0cbe22ff5fac84414196c7ef81506108333cd86b\": container with ID starting with 4ca62c8692c8cc26e90fbdcc0cbe22ff5fac84414196c7ef81506108333cd86b not found: ID does not exist" Oct 03 00:15:13 crc kubenswrapper[4832]: I1003 00:15:13.809119 4832 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-h4hkd"] Oct 03 00:15:13 crc kubenswrapper[4832]: I1003 00:15:13.810973 4832 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-h4hkd"] Oct 03 00:15:15 crc kubenswrapper[4832]: I1003 00:15:15.391966 4832 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8793993f-a364-4eb7-bb10-d30f58f0e2c0" path="/var/lib/kubelet/pods/8793993f-a364-4eb7-bb10-d30f58f0e2c0/volumes" Oct 03 00:15:26 crc kubenswrapper[4832]: I1003 00:15:26.397593 4832 patch_prober.go:28] interesting pod/machine-config-daemon-xzpk4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 00:15:26 crc kubenswrapper[4832]: I1003 00:15:26.399249 4832 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" podUID="13eeb165-19f7-4e3f-a442-1c9c570d853b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 00:15:56 crc kubenswrapper[4832]: I1003 00:15:56.396790 4832 patch_prober.go:28] interesting pod/machine-config-daemon-xzpk4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 00:15:56 crc kubenswrapper[4832]: I1003 00:15:56.397385 4832 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" podUID="13eeb165-19f7-4e3f-a442-1c9c570d853b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 00:16:26 crc kubenswrapper[4832]: I1003 00:16:26.397397 4832 patch_prober.go:28] interesting 
pod/machine-config-daemon-xzpk4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 00:16:26 crc kubenswrapper[4832]: I1003 00:16:26.397981 4832 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" podUID="13eeb165-19f7-4e3f-a442-1c9c570d853b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 00:16:26 crc kubenswrapper[4832]: I1003 00:16:26.398039 4832 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" Oct 03 00:16:26 crc kubenswrapper[4832]: I1003 00:16:26.398560 4832 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ccf77e912021030d6d11e5b9f7160acaf1d5ac5755da48831bdb9eaaf439f604"} pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 00:16:26 crc kubenswrapper[4832]: I1003 00:16:26.398609 4832 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" podUID="13eeb165-19f7-4e3f-a442-1c9c570d853b" containerName="machine-config-daemon" containerID="cri-o://ccf77e912021030d6d11e5b9f7160acaf1d5ac5755da48831bdb9eaaf439f604" gracePeriod=600 Oct 03 00:16:27 crc kubenswrapper[4832]: I1003 00:16:27.131504 4832 generic.go:334] "Generic (PLEG): container finished" podID="13eeb165-19f7-4e3f-a442-1c9c570d853b" containerID="ccf77e912021030d6d11e5b9f7160acaf1d5ac5755da48831bdb9eaaf439f604" exitCode=0 Oct 03 00:16:27 crc kubenswrapper[4832]: I1003 00:16:27.131570 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" event={"ID":"13eeb165-19f7-4e3f-a442-1c9c570d853b","Type":"ContainerDied","Data":"ccf77e912021030d6d11e5b9f7160acaf1d5ac5755da48831bdb9eaaf439f604"} Oct 03 00:16:27 crc kubenswrapper[4832]: I1003 00:16:27.132120 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" event={"ID":"13eeb165-19f7-4e3f-a442-1c9c570d853b","Type":"ContainerStarted","Data":"fcefa3daf9b82c6a6a361876ef1ec6ea0451ec83cac8c17a21a11140c7e5a7a5"} Oct 03 00:16:27 crc kubenswrapper[4832]: I1003 00:16:27.132139 4832 scope.go:117] "RemoveContainer" containerID="aac53b9fd0b5433a7a5671b4f58bd6f53d19df336e73459d1ccd662324d7d19e" Oct 03 00:18:26 crc kubenswrapper[4832]: I1003 00:18:26.396467 4832 patch_prober.go:28] interesting pod/machine-config-daemon-xzpk4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 00:18:26 crc kubenswrapper[4832]: I1003 00:18:26.397117 4832 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" podUID="13eeb165-19f7-4e3f-a442-1c9c570d853b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 00:18:56 
crc kubenswrapper[4832]: I1003 00:18:56.396822 4832 patch_prober.go:28] interesting pod/machine-config-daemon-xzpk4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 00:18:56 crc kubenswrapper[4832]: I1003 00:18:56.397391 4832 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" podUID="13eeb165-19f7-4e3f-a442-1c9c570d853b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 00:19:14 crc kubenswrapper[4832]: I1003 00:19:14.788950 4832 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-mwvp8"] Oct 03 00:19:14 crc kubenswrapper[4832]: I1003 00:19:14.789602 4832 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-mwvp8" podUID="42d0ed4c-fccc-4cec-bffb-60bf97896eb9" containerName="controller-manager" containerID="cri-o://c463fb4ee70659b32d53c6bf73950119c1675dc95ab32f2547252918ee9db46c" gracePeriod=30 Oct 03 00:19:14 crc kubenswrapper[4832]: I1003 00:19:14.895231 4832 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-hqmpt"] Oct 03 00:19:14 crc kubenswrapper[4832]: I1003 00:19:14.895671 4832 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hqmpt" podUID="1d8a5bbc-dc0e-4b7b-b571-83b7be0abe64" containerName="route-controller-manager" containerID="cri-o://051146b5d18de88d505bed620045a0b4cd585af66bc08241d031992851688f52" gracePeriod=30 Oct 03 00:19:14 crc kubenswrapper[4832]: I1003 00:19:14.923562 4832 generic.go:334] "Generic (PLEG): container finished" podID="42d0ed4c-fccc-4cec-bffb-60bf97896eb9" containerID="c463fb4ee70659b32d53c6bf73950119c1675dc95ab32f2547252918ee9db46c" exitCode=0 Oct 03 00:19:14 crc kubenswrapper[4832]: I1003 00:19:14.923613 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-mwvp8" event={"ID":"42d0ed4c-fccc-4cec-bffb-60bf97896eb9","Type":"ContainerDied","Data":"c463fb4ee70659b32d53c6bf73950119c1675dc95ab32f2547252918ee9db46c"} Oct 03 00:19:15 crc kubenswrapper[4832]: I1003 00:19:15.158899 4832 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-mwvp8" Oct 03 00:19:15 crc kubenswrapper[4832]: I1003 00:19:15.212428 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/42d0ed4c-fccc-4cec-bffb-60bf97896eb9-config\") pod \"42d0ed4c-fccc-4cec-bffb-60bf97896eb9\" (UID: \"42d0ed4c-fccc-4cec-bffb-60bf97896eb9\") " Oct 03 00:19:15 crc kubenswrapper[4832]: I1003 00:19:15.212494 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/42d0ed4c-fccc-4cec-bffb-60bf97896eb9-proxy-ca-bundles\") pod \"42d0ed4c-fccc-4cec-bffb-60bf97896eb9\" (UID: \"42d0ed4c-fccc-4cec-bffb-60bf97896eb9\") " Oct 03 00:19:15 crc kubenswrapper[4832]: I1003 00:19:15.212539 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/42d0ed4c-fccc-4cec-bffb-60bf97896eb9-serving-cert\") pod \"42d0ed4c-fccc-4cec-bffb-60bf97896eb9\" (UID: \"42d0ed4c-fccc-4cec-bffb-60bf97896eb9\") " Oct 03 00:19:15 crc kubenswrapper[4832]: I1003 00:19:15.212644 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zmts7\" (UniqueName: \"kubernetes.io/projected/42d0ed4c-fccc-4cec-bffb-60bf97896eb9-kube-api-access-zmts7\") pod \"42d0ed4c-fccc-4cec-bffb-60bf97896eb9\" (UID: \"42d0ed4c-fccc-4cec-bffb-60bf97896eb9\") " Oct 03 00:19:15 crc kubenswrapper[4832]: I1003 00:19:15.212712 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/42d0ed4c-fccc-4cec-bffb-60bf97896eb9-client-ca\") pod \"42d0ed4c-fccc-4cec-bffb-60bf97896eb9\" (UID: \"42d0ed4c-fccc-4cec-bffb-60bf97896eb9\") " Oct 03 00:19:15 crc kubenswrapper[4832]: I1003 00:19:15.213415 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42d0ed4c-fccc-4cec-bffb-60bf97896eb9-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "42d0ed4c-fccc-4cec-bffb-60bf97896eb9" (UID: "42d0ed4c-fccc-4cec-bffb-60bf97896eb9"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 00:19:15 crc kubenswrapper[4832]: I1003 00:19:15.213480 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42d0ed4c-fccc-4cec-bffb-60bf97896eb9-client-ca" (OuterVolumeSpecName: "client-ca") pod "42d0ed4c-fccc-4cec-bffb-60bf97896eb9" (UID: "42d0ed4c-fccc-4cec-bffb-60bf97896eb9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 00:19:15 crc kubenswrapper[4832]: I1003 00:19:15.213574 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42d0ed4c-fccc-4cec-bffb-60bf97896eb9-config" (OuterVolumeSpecName: "config") pod "42d0ed4c-fccc-4cec-bffb-60bf97896eb9" (UID: "42d0ed4c-fccc-4cec-bffb-60bf97896eb9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 00:19:15 crc kubenswrapper[4832]: I1003 00:19:15.226156 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42d0ed4c-fccc-4cec-bffb-60bf97896eb9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "42d0ed4c-fccc-4cec-bffb-60bf97896eb9" (UID: "42d0ed4c-fccc-4cec-bffb-60bf97896eb9"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 00:19:15 crc kubenswrapper[4832]: I1003 00:19:15.226168 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42d0ed4c-fccc-4cec-bffb-60bf97896eb9-kube-api-access-zmts7" (OuterVolumeSpecName: "kube-api-access-zmts7") pod "42d0ed4c-fccc-4cec-bffb-60bf97896eb9" (UID: "42d0ed4c-fccc-4cec-bffb-60bf97896eb9"). InnerVolumeSpecName "kube-api-access-zmts7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 00:19:15 crc kubenswrapper[4832]: I1003 00:19:15.248241 4832 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hqmpt" Oct 03 00:19:15 crc kubenswrapper[4832]: I1003 00:19:15.313516 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1d8a5bbc-dc0e-4b7b-b571-83b7be0abe64-serving-cert\") pod \"1d8a5bbc-dc0e-4b7b-b571-83b7be0abe64\" (UID: \"1d8a5bbc-dc0e-4b7b-b571-83b7be0abe64\") " Oct 03 00:19:15 crc kubenswrapper[4832]: I1003 00:19:15.314130 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1d8a5bbc-dc0e-4b7b-b571-83b7be0abe64-client-ca\") pod \"1d8a5bbc-dc0e-4b7b-b571-83b7be0abe64\" (UID: \"1d8a5bbc-dc0e-4b7b-b571-83b7be0abe64\") " Oct 03 00:19:15 crc kubenswrapper[4832]: I1003 00:19:15.314199 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fjg2h\" (UniqueName: \"kubernetes.io/projected/1d8a5bbc-dc0e-4b7b-b571-83b7be0abe64-kube-api-access-fjg2h\") pod \"1d8a5bbc-dc0e-4b7b-b571-83b7be0abe64\" (UID: \"1d8a5bbc-dc0e-4b7b-b571-83b7be0abe64\") " Oct 03 00:19:15 crc kubenswrapper[4832]: I1003 00:19:15.314237 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d8a5bbc-dc0e-4b7b-b571-83b7be0abe64-config\") pod \"1d8a5bbc-dc0e-4b7b-b571-83b7be0abe64\" (UID: \"1d8a5bbc-dc0e-4b7b-b571-83b7be0abe64\") " Oct 03 00:19:15 crc kubenswrapper[4832]: I1003 00:19:15.314416 4832 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zmts7\" (UniqueName: \"kubernetes.io/projected/42d0ed4c-fccc-4cec-bffb-60bf97896eb9-kube-api-access-zmts7\") on node \"crc\" DevicePath \"\"" Oct 03 00:19:15 crc kubenswrapper[4832]: I1003 00:19:15.314428 4832 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/42d0ed4c-fccc-4cec-bffb-60bf97896eb9-client-ca\") on node \"crc\" DevicePath \"\"" Oct 03 00:19:15 crc kubenswrapper[4832]: I1003 00:19:15.314436 4832 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/42d0ed4c-fccc-4cec-bffb-60bf97896eb9-config\") on node \"crc\" DevicePath \"\"" Oct 03 00:19:15 crc kubenswrapper[4832]: I1003 00:19:15.314445 4832 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/42d0ed4c-fccc-4cec-bffb-60bf97896eb9-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 03 00:19:15 crc kubenswrapper[4832]: I1003 00:19:15.314453 4832 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/42d0ed4c-fccc-4cec-bffb-60bf97896eb9-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 00:19:15 crc kubenswrapper[4832]: I1003 00:19:15.315185 4832 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d8a5bbc-dc0e-4b7b-b571-83b7be0abe64-config" (OuterVolumeSpecName: "config") pod "1d8a5bbc-dc0e-4b7b-b571-83b7be0abe64" (UID: "1d8a5bbc-dc0e-4b7b-b571-83b7be0abe64"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 00:19:15 crc kubenswrapper[4832]: I1003 00:19:15.315244 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d8a5bbc-dc0e-4b7b-b571-83b7be0abe64-client-ca" (OuterVolumeSpecName: "client-ca") pod "1d8a5bbc-dc0e-4b7b-b571-83b7be0abe64" (UID: "1d8a5bbc-dc0e-4b7b-b571-83b7be0abe64"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 00:19:15 crc kubenswrapper[4832]: I1003 00:19:15.316426 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d8a5bbc-dc0e-4b7b-b571-83b7be0abe64-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1d8a5bbc-dc0e-4b7b-b571-83b7be0abe64" (UID: "1d8a5bbc-dc0e-4b7b-b571-83b7be0abe64"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 00:19:15 crc kubenswrapper[4832]: I1003 00:19:15.317250 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d8a5bbc-dc0e-4b7b-b571-83b7be0abe64-kube-api-access-fjg2h" (OuterVolumeSpecName: "kube-api-access-fjg2h") pod "1d8a5bbc-dc0e-4b7b-b571-83b7be0abe64" (UID: "1d8a5bbc-dc0e-4b7b-b571-83b7be0abe64"). InnerVolumeSpecName "kube-api-access-fjg2h". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 00:19:15 crc kubenswrapper[4832]: I1003 00:19:15.416660 4832 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fjg2h\" (UniqueName: \"kubernetes.io/projected/1d8a5bbc-dc0e-4b7b-b571-83b7be0abe64-kube-api-access-fjg2h\") on node \"crc\" DevicePath \"\"" Oct 03 00:19:15 crc kubenswrapper[4832]: I1003 00:19:15.416705 4832 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d8a5bbc-dc0e-4b7b-b571-83b7be0abe64-config\") on node \"crc\" DevicePath \"\"" Oct 03 00:19:15 crc kubenswrapper[4832]: I1003 00:19:15.416738 4832 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1d8a5bbc-dc0e-4b7b-b571-83b7be0abe64-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 00:19:15 crc kubenswrapper[4832]: I1003 00:19:15.416819 4832 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1d8a5bbc-dc0e-4b7b-b571-83b7be0abe64-client-ca\") on node \"crc\" DevicePath \"\"" Oct 03 00:19:15 crc kubenswrapper[4832]: I1003 00:19:15.929975 4832 generic.go:334] "Generic (PLEG): container finished" podID="1d8a5bbc-dc0e-4b7b-b571-83b7be0abe64" containerID="051146b5d18de88d505bed620045a0b4cd585af66bc08241d031992851688f52" exitCode=0 Oct 03 00:19:15 crc kubenswrapper[4832]: I1003 00:19:15.930023 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hqmpt" event={"ID":"1d8a5bbc-dc0e-4b7b-b571-83b7be0abe64","Type":"ContainerDied","Data":"051146b5d18de88d505bed620045a0b4cd585af66bc08241d031992851688f52"} Oct 03 00:19:15 crc kubenswrapper[4832]: I1003 00:19:15.930999 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hqmpt" 
event={"ID":"1d8a5bbc-dc0e-4b7b-b571-83b7be0abe64","Type":"ContainerDied","Data":"b312ff38ddd42529c6b71ee4109fe00dbd38cf6d7e5bc95d292b8141da8603a4"} Oct 03 00:19:15 crc kubenswrapper[4832]: I1003 00:19:15.930046 4832 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hqmpt" Oct 03 00:19:15 crc kubenswrapper[4832]: I1003 00:19:15.931066 4832 scope.go:117] "RemoveContainer" containerID="051146b5d18de88d505bed620045a0b4cd585af66bc08241d031992851688f52" Oct 03 00:19:15 crc kubenswrapper[4832]: I1003 00:19:15.932359 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-mwvp8" event={"ID":"42d0ed4c-fccc-4cec-bffb-60bf97896eb9","Type":"ContainerDied","Data":"b0961b70467d802766ab122d53e01d0ed191326def62f02f0d439c65d35740f4"} Oct 03 00:19:15 crc kubenswrapper[4832]: I1003 00:19:15.932439 4832 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-mwvp8" Oct 03 00:19:15 crc kubenswrapper[4832]: I1003 00:19:15.945643 4832 scope.go:117] "RemoveContainer" containerID="051146b5d18de88d505bed620045a0b4cd585af66bc08241d031992851688f52" Oct 03 00:19:15 crc kubenswrapper[4832]: E1003 00:19:15.946516 4832 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"051146b5d18de88d505bed620045a0b4cd585af66bc08241d031992851688f52\": container with ID starting with 051146b5d18de88d505bed620045a0b4cd585af66bc08241d031992851688f52 not found: ID does not exist" containerID="051146b5d18de88d505bed620045a0b4cd585af66bc08241d031992851688f52" Oct 03 00:19:15 crc kubenswrapper[4832]: I1003 00:19:15.946555 4832 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"051146b5d18de88d505bed620045a0b4cd585af66bc08241d031992851688f52"} err="failed to get container status \"051146b5d18de88d505bed620045a0b4cd585af66bc08241d031992851688f52\": rpc error: code = NotFound desc = could not find container \"051146b5d18de88d505bed620045a0b4cd585af66bc08241d031992851688f52\": container with ID starting with 051146b5d18de88d505bed620045a0b4cd585af66bc08241d031992851688f52 not found: ID does not exist" Oct 03 00:19:15 crc kubenswrapper[4832]: I1003 00:19:15.946585 4832 scope.go:117] "RemoveContainer" containerID="c463fb4ee70659b32d53c6bf73950119c1675dc95ab32f2547252918ee9db46c" Oct 03 00:19:15 crc kubenswrapper[4832]: I1003 00:19:15.948777 4832 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-hqmpt"] Oct 03 00:19:15 crc kubenswrapper[4832]: I1003 00:19:15.951187 4832 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-hqmpt"] Oct 03 00:19:15 crc kubenswrapper[4832]: I1003 00:19:15.958773 4832 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-mwvp8"] Oct 03 00:19:15 crc kubenswrapper[4832]: I1003 00:19:15.961746 4832 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-mwvp8"] Oct 03 00:19:16 crc kubenswrapper[4832]: I1003 00:19:16.699939 4832 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-77f94744cd-8ztqs"] Oct 03 00:19:16 crc kubenswrapper[4832]: E1003 
00:19:16.700374 4832 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8793993f-a364-4eb7-bb10-d30f58f0e2c0" containerName="registry" Oct 03 00:19:16 crc kubenswrapper[4832]: I1003 00:19:16.700386 4832 state_mem.go:107] "Deleted CPUSet assignment" podUID="8793993f-a364-4eb7-bb10-d30f58f0e2c0" containerName="registry" Oct 03 00:19:16 crc kubenswrapper[4832]: E1003 00:19:16.700404 4832 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42d0ed4c-fccc-4cec-bffb-60bf97896eb9" containerName="controller-manager" Oct 03 00:19:16 crc kubenswrapper[4832]: I1003 00:19:16.700410 4832 state_mem.go:107] "Deleted CPUSet assignment" podUID="42d0ed4c-fccc-4cec-bffb-60bf97896eb9" containerName="controller-manager" Oct 03 00:19:16 crc kubenswrapper[4832]: E1003 00:19:16.700421 4832 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d8a5bbc-dc0e-4b7b-b571-83b7be0abe64" containerName="route-controller-manager" Oct 03 00:19:16 crc kubenswrapper[4832]: I1003 00:19:16.700428 4832 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d8a5bbc-dc0e-4b7b-b571-83b7be0abe64" containerName="route-controller-manager" Oct 03 00:19:16 crc kubenswrapper[4832]: E1003 00:19:16.700435 4832 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68737edf-710b-4ea9-af60-305057b107a7" containerName="collect-profiles" Oct 03 00:19:16 crc kubenswrapper[4832]: I1003 00:19:16.700443 4832 state_mem.go:107] "Deleted CPUSet assignment" podUID="68737edf-710b-4ea9-af60-305057b107a7" containerName="collect-profiles" Oct 03 00:19:16 crc kubenswrapper[4832]: I1003 00:19:16.700525 4832 memory_manager.go:354] "RemoveStaleState removing state" podUID="8793993f-a364-4eb7-bb10-d30f58f0e2c0" containerName="registry" Oct 03 00:19:16 crc kubenswrapper[4832]: I1003 00:19:16.700537 4832 memory_manager.go:354] "RemoveStaleState removing state" podUID="68737edf-710b-4ea9-af60-305057b107a7" containerName="collect-profiles" Oct 03 00:19:16 crc kubenswrapper[4832]: I1003 00:19:16.700545 4832 memory_manager.go:354] "RemoveStaleState removing state" podUID="42d0ed4c-fccc-4cec-bffb-60bf97896eb9" containerName="controller-manager" Oct 03 00:19:16 crc kubenswrapper[4832]: I1003 00:19:16.700556 4832 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d8a5bbc-dc0e-4b7b-b571-83b7be0abe64" containerName="route-controller-manager" Oct 03 00:19:16 crc kubenswrapper[4832]: I1003 00:19:16.700882 4832 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-77f94744cd-8ztqs" Oct 03 00:19:16 crc kubenswrapper[4832]: I1003 00:19:16.702698 4832 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Oct 03 00:19:16 crc kubenswrapper[4832]: I1003 00:19:16.702731 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Oct 03 00:19:16 crc kubenswrapper[4832]: I1003 00:19:16.703133 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Oct 03 00:19:16 crc kubenswrapper[4832]: I1003 00:19:16.703577 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Oct 03 00:19:16 crc kubenswrapper[4832]: I1003 00:19:16.705614 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Oct 03 00:19:16 crc kubenswrapper[4832]: I1003 00:19:16.706700 4832 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-777bc85584-pz27p"] Oct 03 00:19:16 crc kubenswrapper[4832]: I1003 00:19:16.706985 4832 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Oct 03 00:19:16 crc kubenswrapper[4832]: I1003 00:19:16.707642 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-777bc85584-pz27p" Oct 03 00:19:16 crc kubenswrapper[4832]: I1003 00:19:16.710179 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Oct 03 00:19:16 crc kubenswrapper[4832]: I1003 00:19:16.710262 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Oct 03 00:19:16 crc kubenswrapper[4832]: I1003 00:19:16.710268 4832 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Oct 03 00:19:16 crc kubenswrapper[4832]: I1003 00:19:16.710427 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Oct 03 00:19:16 crc kubenswrapper[4832]: I1003 00:19:16.711674 4832 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Oct 03 00:19:16 crc kubenswrapper[4832]: I1003 00:19:16.711899 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Oct 03 00:19:16 crc kubenswrapper[4832]: I1003 00:19:16.713111 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Oct 03 00:19:16 crc kubenswrapper[4832]: I1003 00:19:16.714178 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-77f94744cd-8ztqs"] Oct 03 00:19:16 crc kubenswrapper[4832]: I1003 00:19:16.737037 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-777bc85584-pz27p"] Oct 03 00:19:16 crc kubenswrapper[4832]: I1003 00:19:16.831296 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/12b984cd-5d51-4f70-854f-d82ad8b46e45-proxy-ca-bundles\") pod \"controller-manager-77f94744cd-8ztqs\" (UID: \"12b984cd-5d51-4f70-854f-d82ad8b46e45\") " pod="openshift-controller-manager/controller-manager-77f94744cd-8ztqs" Oct 03 00:19:16 crc kubenswrapper[4832]: I1003 00:19:16.831358 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12b984cd-5d51-4f70-854f-d82ad8b46e45-config\") pod \"controller-manager-77f94744cd-8ztqs\" (UID: \"12b984cd-5d51-4f70-854f-d82ad8b46e45\") " pod="openshift-controller-manager/controller-manager-77f94744cd-8ztqs" Oct 03 00:19:16 crc kubenswrapper[4832]: I1003 00:19:16.831442 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3a5e6991-167e-4d4e-af26-ddd48b261d9a-serving-cert\") pod \"route-controller-manager-777bc85584-pz27p\" (UID: \"3a5e6991-167e-4d4e-af26-ddd48b261d9a\") " pod="openshift-route-controller-manager/route-controller-manager-777bc85584-pz27p" Oct 03 00:19:16 crc kubenswrapper[4832]: I1003 00:19:16.831474 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/12b984cd-5d51-4f70-854f-d82ad8b46e45-serving-cert\") pod \"controller-manager-77f94744cd-8ztqs\" (UID: \"12b984cd-5d51-4f70-854f-d82ad8b46e45\") " pod="openshift-controller-manager/controller-manager-77f94744cd-8ztqs" Oct 03 00:19:16 crc kubenswrapper[4832]: I1003 00:19:16.831501 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3a5e6991-167e-4d4e-af26-ddd48b261d9a-client-ca\") pod \"route-controller-manager-777bc85584-pz27p\" (UID: \"3a5e6991-167e-4d4e-af26-ddd48b261d9a\") " pod="openshift-route-controller-manager/route-controller-manager-777bc85584-pz27p" Oct 03 00:19:16 crc kubenswrapper[4832]: I1003 00:19:16.831538 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a5e6991-167e-4d4e-af26-ddd48b261d9a-config\") pod \"route-controller-manager-777bc85584-pz27p\" (UID: \"3a5e6991-167e-4d4e-af26-ddd48b261d9a\") " pod="openshift-route-controller-manager/route-controller-manager-777bc85584-pz27p" Oct 03 00:19:16 crc kubenswrapper[4832]: I1003 00:19:16.831560 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7v78w\" (UniqueName: \"kubernetes.io/projected/12b984cd-5d51-4f70-854f-d82ad8b46e45-kube-api-access-7v78w\") pod \"controller-manager-77f94744cd-8ztqs\" (UID: \"12b984cd-5d51-4f70-854f-d82ad8b46e45\") " pod="openshift-controller-manager/controller-manager-77f94744cd-8ztqs" Oct 03 00:19:16 crc kubenswrapper[4832]: I1003 00:19:16.831587 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5sns\" (UniqueName: \"kubernetes.io/projected/3a5e6991-167e-4d4e-af26-ddd48b261d9a-kube-api-access-c5sns\") pod \"route-controller-manager-777bc85584-pz27p\" (UID: \"3a5e6991-167e-4d4e-af26-ddd48b261d9a\") " pod="openshift-route-controller-manager/route-controller-manager-777bc85584-pz27p" Oct 03 00:19:16 crc kubenswrapper[4832]: I1003 00:19:16.831606 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"client-ca\" (UniqueName: \"kubernetes.io/configmap/12b984cd-5d51-4f70-854f-d82ad8b46e45-client-ca\") pod \"controller-manager-77f94744cd-8ztqs\" (UID: \"12b984cd-5d51-4f70-854f-d82ad8b46e45\") " pod="openshift-controller-manager/controller-manager-77f94744cd-8ztqs" Oct 03 00:19:16 crc kubenswrapper[4832]: I1003 00:19:16.932842 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/12b984cd-5d51-4f70-854f-d82ad8b46e45-proxy-ca-bundles\") pod \"controller-manager-77f94744cd-8ztqs\" (UID: \"12b984cd-5d51-4f70-854f-d82ad8b46e45\") " pod="openshift-controller-manager/controller-manager-77f94744cd-8ztqs" Oct 03 00:19:16 crc kubenswrapper[4832]: I1003 00:19:16.932903 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12b984cd-5d51-4f70-854f-d82ad8b46e45-config\") pod \"controller-manager-77f94744cd-8ztqs\" (UID: \"12b984cd-5d51-4f70-854f-d82ad8b46e45\") " pod="openshift-controller-manager/controller-manager-77f94744cd-8ztqs" Oct 03 00:19:16 crc kubenswrapper[4832]: I1003 00:19:16.932960 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3a5e6991-167e-4d4e-af26-ddd48b261d9a-serving-cert\") pod \"route-controller-manager-777bc85584-pz27p\" (UID: \"3a5e6991-167e-4d4e-af26-ddd48b261d9a\") " pod="openshift-route-controller-manager/route-controller-manager-777bc85584-pz27p" Oct 03 00:19:16 crc kubenswrapper[4832]: I1003 00:19:16.932993 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/12b984cd-5d51-4f70-854f-d82ad8b46e45-serving-cert\") pod \"controller-manager-77f94744cd-8ztqs\" (UID: \"12b984cd-5d51-4f70-854f-d82ad8b46e45\") " pod="openshift-controller-manager/controller-manager-77f94744cd-8ztqs" Oct 03 00:19:16 crc kubenswrapper[4832]: I1003 00:19:16.933015 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3a5e6991-167e-4d4e-af26-ddd48b261d9a-client-ca\") pod \"route-controller-manager-777bc85584-pz27p\" (UID: \"3a5e6991-167e-4d4e-af26-ddd48b261d9a\") " pod="openshift-route-controller-manager/route-controller-manager-777bc85584-pz27p" Oct 03 00:19:16 crc kubenswrapper[4832]: I1003 00:19:16.933044 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a5e6991-167e-4d4e-af26-ddd48b261d9a-config\") pod \"route-controller-manager-777bc85584-pz27p\" (UID: \"3a5e6991-167e-4d4e-af26-ddd48b261d9a\") " pod="openshift-route-controller-manager/route-controller-manager-777bc85584-pz27p" Oct 03 00:19:16 crc kubenswrapper[4832]: I1003 00:19:16.933064 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7v78w\" (UniqueName: \"kubernetes.io/projected/12b984cd-5d51-4f70-854f-d82ad8b46e45-kube-api-access-7v78w\") pod \"controller-manager-77f94744cd-8ztqs\" (UID: \"12b984cd-5d51-4f70-854f-d82ad8b46e45\") " pod="openshift-controller-manager/controller-manager-77f94744cd-8ztqs" Oct 03 00:19:16 crc kubenswrapper[4832]: I1003 00:19:16.933087 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c5sns\" (UniqueName: \"kubernetes.io/projected/3a5e6991-167e-4d4e-af26-ddd48b261d9a-kube-api-access-c5sns\") pod 
\"route-controller-manager-777bc85584-pz27p\" (UID: \"3a5e6991-167e-4d4e-af26-ddd48b261d9a\") " pod="openshift-route-controller-manager/route-controller-manager-777bc85584-pz27p" Oct 03 00:19:16 crc kubenswrapper[4832]: I1003 00:19:16.933105 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/12b984cd-5d51-4f70-854f-d82ad8b46e45-client-ca\") pod \"controller-manager-77f94744cd-8ztqs\" (UID: \"12b984cd-5d51-4f70-854f-d82ad8b46e45\") " pod="openshift-controller-manager/controller-manager-77f94744cd-8ztqs" Oct 03 00:19:16 crc kubenswrapper[4832]: I1003 00:19:16.934194 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/12b984cd-5d51-4f70-854f-d82ad8b46e45-client-ca\") pod \"controller-manager-77f94744cd-8ztqs\" (UID: \"12b984cd-5d51-4f70-854f-d82ad8b46e45\") " pod="openshift-controller-manager/controller-manager-77f94744cd-8ztqs" Oct 03 00:19:16 crc kubenswrapper[4832]: I1003 00:19:16.934273 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/12b984cd-5d51-4f70-854f-d82ad8b46e45-proxy-ca-bundles\") pod \"controller-manager-77f94744cd-8ztqs\" (UID: \"12b984cd-5d51-4f70-854f-d82ad8b46e45\") " pod="openshift-controller-manager/controller-manager-77f94744cd-8ztqs" Oct 03 00:19:16 crc kubenswrapper[4832]: I1003 00:19:16.934536 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3a5e6991-167e-4d4e-af26-ddd48b261d9a-client-ca\") pod \"route-controller-manager-777bc85584-pz27p\" (UID: \"3a5e6991-167e-4d4e-af26-ddd48b261d9a\") " pod="openshift-route-controller-manager/route-controller-manager-777bc85584-pz27p" Oct 03 00:19:16 crc kubenswrapper[4832]: I1003 00:19:16.934601 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12b984cd-5d51-4f70-854f-d82ad8b46e45-config\") pod \"controller-manager-77f94744cd-8ztqs\" (UID: \"12b984cd-5d51-4f70-854f-d82ad8b46e45\") " pod="openshift-controller-manager/controller-manager-77f94744cd-8ztqs" Oct 03 00:19:16 crc kubenswrapper[4832]: I1003 00:19:16.934998 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a5e6991-167e-4d4e-af26-ddd48b261d9a-config\") pod \"route-controller-manager-777bc85584-pz27p\" (UID: \"3a5e6991-167e-4d4e-af26-ddd48b261d9a\") " pod="openshift-route-controller-manager/route-controller-manager-777bc85584-pz27p" Oct 03 00:19:16 crc kubenswrapper[4832]: I1003 00:19:16.937635 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3a5e6991-167e-4d4e-af26-ddd48b261d9a-serving-cert\") pod \"route-controller-manager-777bc85584-pz27p\" (UID: \"3a5e6991-167e-4d4e-af26-ddd48b261d9a\") " pod="openshift-route-controller-manager/route-controller-manager-777bc85584-pz27p" Oct 03 00:19:16 crc kubenswrapper[4832]: I1003 00:19:16.948516 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/12b984cd-5d51-4f70-854f-d82ad8b46e45-serving-cert\") pod \"controller-manager-77f94744cd-8ztqs\" (UID: \"12b984cd-5d51-4f70-854f-d82ad8b46e45\") " pod="openshift-controller-manager/controller-manager-77f94744cd-8ztqs" Oct 03 00:19:16 crc kubenswrapper[4832]: I1003 00:19:16.959744 4832 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5sns\" (UniqueName: \"kubernetes.io/projected/3a5e6991-167e-4d4e-af26-ddd48b261d9a-kube-api-access-c5sns\") pod \"route-controller-manager-777bc85584-pz27p\" (UID: \"3a5e6991-167e-4d4e-af26-ddd48b261d9a\") " pod="openshift-route-controller-manager/route-controller-manager-777bc85584-pz27p" Oct 03 00:19:16 crc kubenswrapper[4832]: I1003 00:19:16.960986 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7v78w\" (UniqueName: \"kubernetes.io/projected/12b984cd-5d51-4f70-854f-d82ad8b46e45-kube-api-access-7v78w\") pod \"controller-manager-77f94744cd-8ztqs\" (UID: \"12b984cd-5d51-4f70-854f-d82ad8b46e45\") " pod="openshift-controller-manager/controller-manager-77f94744cd-8ztqs" Oct 03 00:19:17 crc kubenswrapper[4832]: I1003 00:19:17.019072 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-77f94744cd-8ztqs" Oct 03 00:19:17 crc kubenswrapper[4832]: I1003 00:19:17.034360 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-777bc85584-pz27p" Oct 03 00:19:17 crc kubenswrapper[4832]: I1003 00:19:17.393940 4832 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d8a5bbc-dc0e-4b7b-b571-83b7be0abe64" path="/var/lib/kubelet/pods/1d8a5bbc-dc0e-4b7b-b571-83b7be0abe64/volumes" Oct 03 00:19:17 crc kubenswrapper[4832]: I1003 00:19:17.394717 4832 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42d0ed4c-fccc-4cec-bffb-60bf97896eb9" path="/var/lib/kubelet/pods/42d0ed4c-fccc-4cec-bffb-60bf97896eb9/volumes" Oct 03 00:19:17 crc kubenswrapper[4832]: I1003 00:19:17.458965 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-77f94744cd-8ztqs"] Oct 03 00:19:17 crc kubenswrapper[4832]: I1003 00:19:17.461689 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-777bc85584-pz27p"] Oct 03 00:19:17 crc kubenswrapper[4832]: W1003 00:19:17.466910 4832 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3a5e6991_167e_4d4e_af26_ddd48b261d9a.slice/crio-14f9ec02e1b0f9db5212af03439699d93e3c2db11d17cfd8bde87a0d2494c03b WatchSource:0}: Error finding container 14f9ec02e1b0f9db5212af03439699d93e3c2db11d17cfd8bde87a0d2494c03b: Status 404 returned error can't find the container with id 14f9ec02e1b0f9db5212af03439699d93e3c2db11d17cfd8bde87a0d2494c03b Oct 03 00:19:17 crc kubenswrapper[4832]: I1003 00:19:17.954098 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-77f94744cd-8ztqs" event={"ID":"12b984cd-5d51-4f70-854f-d82ad8b46e45","Type":"ContainerStarted","Data":"aee080c87db30337a856edc231c97849e73bdb6d47a1763d9c25c2741bab7817"} Oct 03 00:19:17 crc kubenswrapper[4832]: I1003 00:19:17.954142 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-77f94744cd-8ztqs" event={"ID":"12b984cd-5d51-4f70-854f-d82ad8b46e45","Type":"ContainerStarted","Data":"d6ad19e3e0e86c8c2551f4b59a9ca911d094c800392f5fcb5f3e53459b22a109"} Oct 03 00:19:17 crc kubenswrapper[4832]: I1003 00:19:17.954490 4832 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-77f94744cd-8ztqs" 
Oct 03 00:19:17 crc kubenswrapper[4832]: I1003 00:19:17.956056 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-777bc85584-pz27p" event={"ID":"3a5e6991-167e-4d4e-af26-ddd48b261d9a","Type":"ContainerStarted","Data":"4fbceb7254fda4e580034dc82c125d6f845b59c3f1d3c20fc16006f7d2a7c567"} Oct 03 00:19:17 crc kubenswrapper[4832]: I1003 00:19:17.956379 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-777bc85584-pz27p" event={"ID":"3a5e6991-167e-4d4e-af26-ddd48b261d9a","Type":"ContainerStarted","Data":"14f9ec02e1b0f9db5212af03439699d93e3c2db11d17cfd8bde87a0d2494c03b"} Oct 03 00:19:17 crc kubenswrapper[4832]: I1003 00:19:17.956400 4832 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-777bc85584-pz27p" Oct 03 00:19:17 crc kubenswrapper[4832]: I1003 00:19:17.960785 4832 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-77f94744cd-8ztqs" Oct 03 00:19:17 crc kubenswrapper[4832]: I1003 00:19:17.995238 4832 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-77f94744cd-8ztqs" podStartSLOduration=3.9952143639999997 podStartE2EDuration="3.995214364s" podCreationTimestamp="2025-10-03 00:19:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 00:19:17.972401786 +0000 UTC m=+751.234556902" watchObservedRunningTime="2025-10-03 00:19:17.995214364 +0000 UTC m=+751.257369470" Oct 03 00:19:18 crc kubenswrapper[4832]: I1003 00:19:18.187578 4832 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-777bc85584-pz27p" Oct 03 00:19:18 crc kubenswrapper[4832]: I1003 00:19:18.203845 4832 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-777bc85584-pz27p" podStartSLOduration=4.203829114 podStartE2EDuration="4.203829114s" podCreationTimestamp="2025-10-03 00:19:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 00:19:18.025147964 +0000 UTC m=+751.287303090" watchObservedRunningTime="2025-10-03 00:19:18.203829114 +0000 UTC m=+751.465984220" Oct 03 00:19:21 crc kubenswrapper[4832]: I1003 00:19:21.870321 4832 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Oct 03 00:19:26 crc kubenswrapper[4832]: I1003 00:19:26.397644 4832 patch_prober.go:28] interesting pod/machine-config-daemon-xzpk4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 00:19:26 crc kubenswrapper[4832]: I1003 00:19:26.398027 4832 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" podUID="13eeb165-19f7-4e3f-a442-1c9c570d853b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 00:19:26 crc kubenswrapper[4832]: I1003 
00:19:26.398076 4832 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" Oct 03 00:19:26 crc kubenswrapper[4832]: I1003 00:19:26.398675 4832 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"fcefa3daf9b82c6a6a361876ef1ec6ea0451ec83cac8c17a21a11140c7e5a7a5"} pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 00:19:26 crc kubenswrapper[4832]: I1003 00:19:26.398784 4832 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" podUID="13eeb165-19f7-4e3f-a442-1c9c570d853b" containerName="machine-config-daemon" containerID="cri-o://fcefa3daf9b82c6a6a361876ef1ec6ea0451ec83cac8c17a21a11140c7e5a7a5" gracePeriod=600 Oct 03 00:19:27 crc kubenswrapper[4832]: I1003 00:19:27.001712 4832 generic.go:334] "Generic (PLEG): container finished" podID="13eeb165-19f7-4e3f-a442-1c9c570d853b" containerID="fcefa3daf9b82c6a6a361876ef1ec6ea0451ec83cac8c17a21a11140c7e5a7a5" exitCode=0 Oct 03 00:19:27 crc kubenswrapper[4832]: I1003 00:19:27.001788 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" event={"ID":"13eeb165-19f7-4e3f-a442-1c9c570d853b","Type":"ContainerDied","Data":"fcefa3daf9b82c6a6a361876ef1ec6ea0451ec83cac8c17a21a11140c7e5a7a5"} Oct 03 00:19:27 crc kubenswrapper[4832]: I1003 00:19:27.002134 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" event={"ID":"13eeb165-19f7-4e3f-a442-1c9c570d853b","Type":"ContainerStarted","Data":"fe714a41dbfd3df82e68d6952db5280812ff6c66e0e98d29ae26f7117a7c4a98"} Oct 03 00:19:27 crc kubenswrapper[4832]: I1003 00:19:27.002164 4832 scope.go:117] "RemoveContainer" containerID="ccf77e912021030d6d11e5b9f7160acaf1d5ac5755da48831bdb9eaaf439f604" Oct 03 00:19:47 crc kubenswrapper[4832]: I1003 00:19:47.103640 4832 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-hrdvn"] Oct 03 00:19:47 crc kubenswrapper[4832]: I1003 00:19:47.105330 4832 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-hrdvn" Oct 03 00:19:47 crc kubenswrapper[4832]: I1003 00:19:47.118872 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hrdvn"] Oct 03 00:19:47 crc kubenswrapper[4832]: I1003 00:19:47.187851 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96948108-5cf0-4f64-b5e9-571f42b5baad-catalog-content\") pod \"redhat-operators-hrdvn\" (UID: \"96948108-5cf0-4f64-b5e9-571f42b5baad\") " pod="openshift-marketplace/redhat-operators-hrdvn" Oct 03 00:19:47 crc kubenswrapper[4832]: I1003 00:19:47.187938 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8prtt\" (UniqueName: \"kubernetes.io/projected/96948108-5cf0-4f64-b5e9-571f42b5baad-kube-api-access-8prtt\") pod \"redhat-operators-hrdvn\" (UID: \"96948108-5cf0-4f64-b5e9-571f42b5baad\") " pod="openshift-marketplace/redhat-operators-hrdvn" Oct 03 00:19:47 crc kubenswrapper[4832]: I1003 00:19:47.188041 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96948108-5cf0-4f64-b5e9-571f42b5baad-utilities\") pod \"redhat-operators-hrdvn\" (UID: \"96948108-5cf0-4f64-b5e9-571f42b5baad\") " pod="openshift-marketplace/redhat-operators-hrdvn" Oct 03 00:19:47 crc kubenswrapper[4832]: I1003 00:19:47.289441 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96948108-5cf0-4f64-b5e9-571f42b5baad-catalog-content\") pod \"redhat-operators-hrdvn\" (UID: \"96948108-5cf0-4f64-b5e9-571f42b5baad\") " pod="openshift-marketplace/redhat-operators-hrdvn" Oct 03 00:19:47 crc kubenswrapper[4832]: I1003 00:19:47.289495 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8prtt\" (UniqueName: \"kubernetes.io/projected/96948108-5cf0-4f64-b5e9-571f42b5baad-kube-api-access-8prtt\") pod \"redhat-operators-hrdvn\" (UID: \"96948108-5cf0-4f64-b5e9-571f42b5baad\") " pod="openshift-marketplace/redhat-operators-hrdvn" Oct 03 00:19:47 crc kubenswrapper[4832]: I1003 00:19:47.289560 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96948108-5cf0-4f64-b5e9-571f42b5baad-utilities\") pod \"redhat-operators-hrdvn\" (UID: \"96948108-5cf0-4f64-b5e9-571f42b5baad\") " pod="openshift-marketplace/redhat-operators-hrdvn" Oct 03 00:19:47 crc kubenswrapper[4832]: I1003 00:19:47.290083 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96948108-5cf0-4f64-b5e9-571f42b5baad-utilities\") pod \"redhat-operators-hrdvn\" (UID: \"96948108-5cf0-4f64-b5e9-571f42b5baad\") " pod="openshift-marketplace/redhat-operators-hrdvn" Oct 03 00:19:47 crc kubenswrapper[4832]: I1003 00:19:47.290391 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96948108-5cf0-4f64-b5e9-571f42b5baad-catalog-content\") pod \"redhat-operators-hrdvn\" (UID: \"96948108-5cf0-4f64-b5e9-571f42b5baad\") " pod="openshift-marketplace/redhat-operators-hrdvn" Oct 03 00:19:47 crc kubenswrapper[4832]: I1003 00:19:47.317872 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-8prtt\" (UniqueName: \"kubernetes.io/projected/96948108-5cf0-4f64-b5e9-571f42b5baad-kube-api-access-8prtt\") pod \"redhat-operators-hrdvn\" (UID: \"96948108-5cf0-4f64-b5e9-571f42b5baad\") " pod="openshift-marketplace/redhat-operators-hrdvn" Oct 03 00:19:47 crc kubenswrapper[4832]: I1003 00:19:47.423395 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hrdvn" Oct 03 00:19:47 crc kubenswrapper[4832]: I1003 00:19:47.820319 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hrdvn"] Oct 03 00:19:47 crc kubenswrapper[4832]: W1003 00:19:47.825659 4832 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod96948108_5cf0_4f64_b5e9_571f42b5baad.slice/crio-6558cc94fa25097657aabd517975373d1b74ed12c2a8d1b0f08d578aee2bfd0b WatchSource:0}: Error finding container 6558cc94fa25097657aabd517975373d1b74ed12c2a8d1b0f08d578aee2bfd0b: Status 404 returned error can't find the container with id 6558cc94fa25097657aabd517975373d1b74ed12c2a8d1b0f08d578aee2bfd0b Oct 03 00:19:48 crc kubenswrapper[4832]: I1003 00:19:48.135771 4832 generic.go:334] "Generic (PLEG): container finished" podID="96948108-5cf0-4f64-b5e9-571f42b5baad" containerID="69d59b1863f581af11288ee2335eb550140c86a9a8bc83db12a88c0fc2091794" exitCode=0 Oct 03 00:19:48 crc kubenswrapper[4832]: I1003 00:19:48.135815 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hrdvn" event={"ID":"96948108-5cf0-4f64-b5e9-571f42b5baad","Type":"ContainerDied","Data":"69d59b1863f581af11288ee2335eb550140c86a9a8bc83db12a88c0fc2091794"} Oct 03 00:19:48 crc kubenswrapper[4832]: I1003 00:19:48.135847 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hrdvn" event={"ID":"96948108-5cf0-4f64-b5e9-571f42b5baad","Type":"ContainerStarted","Data":"6558cc94fa25097657aabd517975373d1b74ed12c2a8d1b0f08d578aee2bfd0b"} Oct 03 00:19:48 crc kubenswrapper[4832]: I1003 00:19:48.137613 4832 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 03 00:19:49 crc kubenswrapper[4832]: I1003 00:19:49.143197 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hrdvn" event={"ID":"96948108-5cf0-4f64-b5e9-571f42b5baad","Type":"ContainerStarted","Data":"75f1a8fabec062e8ef7935058559c24ef98070c6f03f258b629bb56259e9ed70"} Oct 03 00:19:50 crc kubenswrapper[4832]: I1003 00:19:50.148522 4832 generic.go:334] "Generic (PLEG): container finished" podID="96948108-5cf0-4f64-b5e9-571f42b5baad" containerID="75f1a8fabec062e8ef7935058559c24ef98070c6f03f258b629bb56259e9ed70" exitCode=0 Oct 03 00:19:50 crc kubenswrapper[4832]: I1003 00:19:50.148561 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hrdvn" event={"ID":"96948108-5cf0-4f64-b5e9-571f42b5baad","Type":"ContainerDied","Data":"75f1a8fabec062e8ef7935058559c24ef98070c6f03f258b629bb56259e9ed70"} Oct 03 00:19:51 crc kubenswrapper[4832]: I1003 00:19:51.155166 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hrdvn" event={"ID":"96948108-5cf0-4f64-b5e9-571f42b5baad","Type":"ContainerStarted","Data":"7ab236871001e5d39c2c323ea03e5c11d62f605759ce9b7fc54dae2f5fb94ae6"} Oct 03 00:19:51 crc kubenswrapper[4832]: I1003 00:19:51.172011 4832 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-marketplace/redhat-operators-hrdvn" podStartSLOduration=1.727238338 podStartE2EDuration="4.171994247s" podCreationTimestamp="2025-10-03 00:19:47 +0000 UTC" firstStartedPulling="2025-10-03 00:19:48.137361315 +0000 UTC m=+781.399516421" lastFinishedPulling="2025-10-03 00:19:50.582117224 +0000 UTC m=+783.844272330" observedRunningTime="2025-10-03 00:19:51.170583782 +0000 UTC m=+784.432738898" watchObservedRunningTime="2025-10-03 00:19:51.171994247 +0000 UTC m=+784.434149353" Oct 03 00:19:57 crc kubenswrapper[4832]: I1003 00:19:57.424167 4832 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-hrdvn" Oct 03 00:19:57 crc kubenswrapper[4832]: I1003 00:19:57.424581 4832 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-hrdvn" Oct 03 00:19:57 crc kubenswrapper[4832]: I1003 00:19:57.460155 4832 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-hrdvn" Oct 03 00:19:58 crc kubenswrapper[4832]: I1003 00:19:58.224543 4832 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-hrdvn" Oct 03 00:19:58 crc kubenswrapper[4832]: I1003 00:19:58.260557 4832 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hrdvn"] Oct 03 00:20:00 crc kubenswrapper[4832]: I1003 00:20:00.197701 4832 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-hrdvn" podUID="96948108-5cf0-4f64-b5e9-571f42b5baad" containerName="registry-server" containerID="cri-o://7ab236871001e5d39c2c323ea03e5c11d62f605759ce9b7fc54dae2f5fb94ae6" gracePeriod=2 Oct 03 00:20:00 crc kubenswrapper[4832]: I1003 00:20:00.555182 4832 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hrdvn" Oct 03 00:20:00 crc kubenswrapper[4832]: I1003 00:20:00.646791 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96948108-5cf0-4f64-b5e9-571f42b5baad-utilities\") pod \"96948108-5cf0-4f64-b5e9-571f42b5baad\" (UID: \"96948108-5cf0-4f64-b5e9-571f42b5baad\") " Oct 03 00:20:00 crc kubenswrapper[4832]: I1003 00:20:00.646848 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8prtt\" (UniqueName: \"kubernetes.io/projected/96948108-5cf0-4f64-b5e9-571f42b5baad-kube-api-access-8prtt\") pod \"96948108-5cf0-4f64-b5e9-571f42b5baad\" (UID: \"96948108-5cf0-4f64-b5e9-571f42b5baad\") " Oct 03 00:20:00 crc kubenswrapper[4832]: I1003 00:20:00.646891 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96948108-5cf0-4f64-b5e9-571f42b5baad-catalog-content\") pod \"96948108-5cf0-4f64-b5e9-571f42b5baad\" (UID: \"96948108-5cf0-4f64-b5e9-571f42b5baad\") " Oct 03 00:20:00 crc kubenswrapper[4832]: I1003 00:20:00.647862 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96948108-5cf0-4f64-b5e9-571f42b5baad-utilities" (OuterVolumeSpecName: "utilities") pod "96948108-5cf0-4f64-b5e9-571f42b5baad" (UID: "96948108-5cf0-4f64-b5e9-571f42b5baad"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 00:20:00 crc kubenswrapper[4832]: I1003 00:20:00.652478 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96948108-5cf0-4f64-b5e9-571f42b5baad-kube-api-access-8prtt" (OuterVolumeSpecName: "kube-api-access-8prtt") pod "96948108-5cf0-4f64-b5e9-571f42b5baad" (UID: "96948108-5cf0-4f64-b5e9-571f42b5baad"). InnerVolumeSpecName "kube-api-access-8prtt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 00:20:00 crc kubenswrapper[4832]: I1003 00:20:00.729451 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96948108-5cf0-4f64-b5e9-571f42b5baad-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "96948108-5cf0-4f64-b5e9-571f42b5baad" (UID: "96948108-5cf0-4f64-b5e9-571f42b5baad"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 00:20:00 crc kubenswrapper[4832]: I1003 00:20:00.748086 4832 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96948108-5cf0-4f64-b5e9-571f42b5baad-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 00:20:00 crc kubenswrapper[4832]: I1003 00:20:00.748126 4832 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96948108-5cf0-4f64-b5e9-571f42b5baad-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 00:20:00 crc kubenswrapper[4832]: I1003 00:20:00.748141 4832 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8prtt\" (UniqueName: \"kubernetes.io/projected/96948108-5cf0-4f64-b5e9-571f42b5baad-kube-api-access-8prtt\") on node \"crc\" DevicePath \"\"" Oct 03 00:20:01 crc kubenswrapper[4832]: I1003 00:20:01.209566 4832 generic.go:334] "Generic (PLEG): container finished" podID="96948108-5cf0-4f64-b5e9-571f42b5baad" containerID="7ab236871001e5d39c2c323ea03e5c11d62f605759ce9b7fc54dae2f5fb94ae6" exitCode=0 Oct 03 00:20:01 crc kubenswrapper[4832]: I1003 00:20:01.209637 4832 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-hrdvn" Oct 03 00:20:01 crc kubenswrapper[4832]: I1003 00:20:01.209641 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hrdvn" event={"ID":"96948108-5cf0-4f64-b5e9-571f42b5baad","Type":"ContainerDied","Data":"7ab236871001e5d39c2c323ea03e5c11d62f605759ce9b7fc54dae2f5fb94ae6"} Oct 03 00:20:01 crc kubenswrapper[4832]: I1003 00:20:01.209814 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hrdvn" event={"ID":"96948108-5cf0-4f64-b5e9-571f42b5baad","Type":"ContainerDied","Data":"6558cc94fa25097657aabd517975373d1b74ed12c2a8d1b0f08d578aee2bfd0b"} Oct 03 00:20:01 crc kubenswrapper[4832]: I1003 00:20:01.209846 4832 scope.go:117] "RemoveContainer" containerID="7ab236871001e5d39c2c323ea03e5c11d62f605759ce9b7fc54dae2f5fb94ae6" Oct 03 00:20:01 crc kubenswrapper[4832]: I1003 00:20:01.235489 4832 scope.go:117] "RemoveContainer" containerID="75f1a8fabec062e8ef7935058559c24ef98070c6f03f258b629bb56259e9ed70" Oct 03 00:20:01 crc kubenswrapper[4832]: I1003 00:20:01.251991 4832 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hrdvn"] Oct 03 00:20:01 crc kubenswrapper[4832]: I1003 00:20:01.258430 4832 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-hrdvn"] Oct 03 00:20:01 crc kubenswrapper[4832]: I1003 00:20:01.269039 4832 scope.go:117] "RemoveContainer" containerID="69d59b1863f581af11288ee2335eb550140c86a9a8bc83db12a88c0fc2091794" Oct 03 00:20:01 crc kubenswrapper[4832]: I1003 00:20:01.283889 4832 scope.go:117] "RemoveContainer" containerID="7ab236871001e5d39c2c323ea03e5c11d62f605759ce9b7fc54dae2f5fb94ae6" Oct 03 00:20:01 crc kubenswrapper[4832]: E1003 00:20:01.284299 4832 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ab236871001e5d39c2c323ea03e5c11d62f605759ce9b7fc54dae2f5fb94ae6\": container with ID starting with 7ab236871001e5d39c2c323ea03e5c11d62f605759ce9b7fc54dae2f5fb94ae6 not found: ID does not exist" containerID="7ab236871001e5d39c2c323ea03e5c11d62f605759ce9b7fc54dae2f5fb94ae6" Oct 03 00:20:01 crc kubenswrapper[4832]: I1003 00:20:01.284326 4832 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ab236871001e5d39c2c323ea03e5c11d62f605759ce9b7fc54dae2f5fb94ae6"} err="failed to get container status \"7ab236871001e5d39c2c323ea03e5c11d62f605759ce9b7fc54dae2f5fb94ae6\": rpc error: code = NotFound desc = could not find container \"7ab236871001e5d39c2c323ea03e5c11d62f605759ce9b7fc54dae2f5fb94ae6\": container with ID starting with 7ab236871001e5d39c2c323ea03e5c11d62f605759ce9b7fc54dae2f5fb94ae6 not found: ID does not exist" Oct 03 00:20:01 crc kubenswrapper[4832]: I1003 00:20:01.284347 4832 scope.go:117] "RemoveContainer" containerID="75f1a8fabec062e8ef7935058559c24ef98070c6f03f258b629bb56259e9ed70" Oct 03 00:20:01 crc kubenswrapper[4832]: E1003 00:20:01.284601 4832 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75f1a8fabec062e8ef7935058559c24ef98070c6f03f258b629bb56259e9ed70\": container with ID starting with 75f1a8fabec062e8ef7935058559c24ef98070c6f03f258b629bb56259e9ed70 not found: ID does not exist" containerID="75f1a8fabec062e8ef7935058559c24ef98070c6f03f258b629bb56259e9ed70" Oct 03 00:20:01 crc kubenswrapper[4832]: I1003 00:20:01.284624 4832 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75f1a8fabec062e8ef7935058559c24ef98070c6f03f258b629bb56259e9ed70"} err="failed to get container status \"75f1a8fabec062e8ef7935058559c24ef98070c6f03f258b629bb56259e9ed70\": rpc error: code = NotFound desc = could not find container \"75f1a8fabec062e8ef7935058559c24ef98070c6f03f258b629bb56259e9ed70\": container with ID starting with 75f1a8fabec062e8ef7935058559c24ef98070c6f03f258b629bb56259e9ed70 not found: ID does not exist" Oct 03 00:20:01 crc kubenswrapper[4832]: I1003 00:20:01.284637 4832 scope.go:117] "RemoveContainer" containerID="69d59b1863f581af11288ee2335eb550140c86a9a8bc83db12a88c0fc2091794" Oct 03 00:20:01 crc kubenswrapper[4832]: E1003 00:20:01.284979 4832 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"69d59b1863f581af11288ee2335eb550140c86a9a8bc83db12a88c0fc2091794\": container with ID starting with 69d59b1863f581af11288ee2335eb550140c86a9a8bc83db12a88c0fc2091794 not found: ID does not exist" containerID="69d59b1863f581af11288ee2335eb550140c86a9a8bc83db12a88c0fc2091794" Oct 03 00:20:01 crc kubenswrapper[4832]: I1003 00:20:01.284997 4832 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69d59b1863f581af11288ee2335eb550140c86a9a8bc83db12a88c0fc2091794"} err="failed to get container status \"69d59b1863f581af11288ee2335eb550140c86a9a8bc83db12a88c0fc2091794\": rpc error: code = NotFound desc = could not find container \"69d59b1863f581af11288ee2335eb550140c86a9a8bc83db12a88c0fc2091794\": container with ID starting with 69d59b1863f581af11288ee2335eb550140c86a9a8bc83db12a88c0fc2091794 not found: ID does not exist" Oct 03 00:20:01 crc kubenswrapper[4832]: I1003 00:20:01.393583 4832 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96948108-5cf0-4f64-b5e9-571f42b5baad" path="/var/lib/kubelet/pods/96948108-5cf0-4f64-b5e9-571f42b5baad/volumes" Oct 03 00:20:40 crc kubenswrapper[4832]: I1003 00:20:40.298016 4832 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-7qm46"] Oct 03 00:20:40 crc kubenswrapper[4832]: E1003 00:20:40.298691 4832 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96948108-5cf0-4f64-b5e9-571f42b5baad" containerName="extract-content" Oct 03 00:20:40 crc kubenswrapper[4832]: I1003 00:20:40.298702 4832 state_mem.go:107] "Deleted CPUSet assignment" podUID="96948108-5cf0-4f64-b5e9-571f42b5baad" containerName="extract-content" Oct 03 00:20:40 crc kubenswrapper[4832]: E1003 00:20:40.298715 4832 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96948108-5cf0-4f64-b5e9-571f42b5baad" containerName="extract-utilities" Oct 03 00:20:40 crc kubenswrapper[4832]: I1003 00:20:40.298720 4832 state_mem.go:107] "Deleted CPUSet assignment" podUID="96948108-5cf0-4f64-b5e9-571f42b5baad" containerName="extract-utilities" Oct 03 00:20:40 crc kubenswrapper[4832]: E1003 00:20:40.298734 4832 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96948108-5cf0-4f64-b5e9-571f42b5baad" containerName="registry-server" Oct 03 00:20:40 crc kubenswrapper[4832]: I1003 00:20:40.298741 4832 state_mem.go:107] "Deleted CPUSet assignment" podUID="96948108-5cf0-4f64-b5e9-571f42b5baad" containerName="registry-server" Oct 03 00:20:40 crc kubenswrapper[4832]: I1003 00:20:40.298837 4832 memory_manager.go:354] "RemoveStaleState removing state" podUID="96948108-5cf0-4f64-b5e9-571f42b5baad" 
containerName="registry-server" Oct 03 00:20:40 crc kubenswrapper[4832]: I1003 00:20:40.299505 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7qm46" Oct 03 00:20:40 crc kubenswrapper[4832]: I1003 00:20:40.327153 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7qm46"] Oct 03 00:20:40 crc kubenswrapper[4832]: I1003 00:20:40.431550 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2zszk\" (UniqueName: \"kubernetes.io/projected/d47f949d-a8be-45c6-b112-9421b16e4e64-kube-api-access-2zszk\") pod \"redhat-marketplace-7qm46\" (UID: \"d47f949d-a8be-45c6-b112-9421b16e4e64\") " pod="openshift-marketplace/redhat-marketplace-7qm46" Oct 03 00:20:40 crc kubenswrapper[4832]: I1003 00:20:40.431678 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d47f949d-a8be-45c6-b112-9421b16e4e64-catalog-content\") pod \"redhat-marketplace-7qm46\" (UID: \"d47f949d-a8be-45c6-b112-9421b16e4e64\") " pod="openshift-marketplace/redhat-marketplace-7qm46" Oct 03 00:20:40 crc kubenswrapper[4832]: I1003 00:20:40.431702 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d47f949d-a8be-45c6-b112-9421b16e4e64-utilities\") pod \"redhat-marketplace-7qm46\" (UID: \"d47f949d-a8be-45c6-b112-9421b16e4e64\") " pod="openshift-marketplace/redhat-marketplace-7qm46" Oct 03 00:20:40 crc kubenswrapper[4832]: I1003 00:20:40.533338 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d47f949d-a8be-45c6-b112-9421b16e4e64-catalog-content\") pod \"redhat-marketplace-7qm46\" (UID: \"d47f949d-a8be-45c6-b112-9421b16e4e64\") " pod="openshift-marketplace/redhat-marketplace-7qm46" Oct 03 00:20:40 crc kubenswrapper[4832]: I1003 00:20:40.533385 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d47f949d-a8be-45c6-b112-9421b16e4e64-utilities\") pod \"redhat-marketplace-7qm46\" (UID: \"d47f949d-a8be-45c6-b112-9421b16e4e64\") " pod="openshift-marketplace/redhat-marketplace-7qm46" Oct 03 00:20:40 crc kubenswrapper[4832]: I1003 00:20:40.533444 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2zszk\" (UniqueName: \"kubernetes.io/projected/d47f949d-a8be-45c6-b112-9421b16e4e64-kube-api-access-2zszk\") pod \"redhat-marketplace-7qm46\" (UID: \"d47f949d-a8be-45c6-b112-9421b16e4e64\") " pod="openshift-marketplace/redhat-marketplace-7qm46" Oct 03 00:20:40 crc kubenswrapper[4832]: I1003 00:20:40.533959 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d47f949d-a8be-45c6-b112-9421b16e4e64-utilities\") pod \"redhat-marketplace-7qm46\" (UID: \"d47f949d-a8be-45c6-b112-9421b16e4e64\") " pod="openshift-marketplace/redhat-marketplace-7qm46" Oct 03 00:20:40 crc kubenswrapper[4832]: I1003 00:20:40.533964 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d47f949d-a8be-45c6-b112-9421b16e4e64-catalog-content\") pod \"redhat-marketplace-7qm46\" (UID: \"d47f949d-a8be-45c6-b112-9421b16e4e64\") " 
pod="openshift-marketplace/redhat-marketplace-7qm46" Oct 03 00:20:40 crc kubenswrapper[4832]: I1003 00:20:40.551856 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2zszk\" (UniqueName: \"kubernetes.io/projected/d47f949d-a8be-45c6-b112-9421b16e4e64-kube-api-access-2zszk\") pod \"redhat-marketplace-7qm46\" (UID: \"d47f949d-a8be-45c6-b112-9421b16e4e64\") " pod="openshift-marketplace/redhat-marketplace-7qm46" Oct 03 00:20:40 crc kubenswrapper[4832]: I1003 00:20:40.616801 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7qm46" Oct 03 00:20:41 crc kubenswrapper[4832]: I1003 00:20:41.015368 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7qm46"] Oct 03 00:20:41 crc kubenswrapper[4832]: I1003 00:20:41.410961 4832 generic.go:334] "Generic (PLEG): container finished" podID="d47f949d-a8be-45c6-b112-9421b16e4e64" containerID="5e699851165713d21eb470ccda9bc2106c487a538ecd32be877133b4916c7b6b" exitCode=0 Oct 03 00:20:41 crc kubenswrapper[4832]: I1003 00:20:41.410998 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7qm46" event={"ID":"d47f949d-a8be-45c6-b112-9421b16e4e64","Type":"ContainerDied","Data":"5e699851165713d21eb470ccda9bc2106c487a538ecd32be877133b4916c7b6b"} Oct 03 00:20:41 crc kubenswrapper[4832]: I1003 00:20:41.411023 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7qm46" event={"ID":"d47f949d-a8be-45c6-b112-9421b16e4e64","Type":"ContainerStarted","Data":"010607491e4f325ee7aaf90d1d46dbca23f0fa38ad36b02f0939f06c95434ce6"} Oct 03 00:20:42 crc kubenswrapper[4832]: I1003 00:20:42.417644 4832 generic.go:334] "Generic (PLEG): container finished" podID="d47f949d-a8be-45c6-b112-9421b16e4e64" containerID="5d29bbc583e51f4b73151679576a576643c8209c2e263b56891609a40367bcc7" exitCode=0 Oct 03 00:20:42 crc kubenswrapper[4832]: I1003 00:20:42.417872 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7qm46" event={"ID":"d47f949d-a8be-45c6-b112-9421b16e4e64","Type":"ContainerDied","Data":"5d29bbc583e51f4b73151679576a576643c8209c2e263b56891609a40367bcc7"} Oct 03 00:20:43 crc kubenswrapper[4832]: I1003 00:20:43.424542 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7qm46" event={"ID":"d47f949d-a8be-45c6-b112-9421b16e4e64","Type":"ContainerStarted","Data":"1bd3d3edd18d2ffb1a40c48075cb4bdc7f8ce09ce616869de2e24990ebcb7dda"} Oct 03 00:20:43 crc kubenswrapper[4832]: I1003 00:20:43.443081 4832 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-7qm46" podStartSLOduration=1.9309663769999998 podStartE2EDuration="3.443063326s" podCreationTimestamp="2025-10-03 00:20:40 +0000 UTC" firstStartedPulling="2025-10-03 00:20:41.412788961 +0000 UTC m=+834.674944067" lastFinishedPulling="2025-10-03 00:20:42.92488591 +0000 UTC m=+836.187041016" observedRunningTime="2025-10-03 00:20:43.441303401 +0000 UTC m=+836.703458517" watchObservedRunningTime="2025-10-03 00:20:43.443063326 +0000 UTC m=+836.705218432" Oct 03 00:20:48 crc kubenswrapper[4832]: I1003 00:20:48.375751 4832 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-72k5k"] Oct 03 00:20:48 crc kubenswrapper[4832]: I1003 00:20:48.376917 4832 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-72k5k" Oct 03 00:20:48 crc kubenswrapper[4832]: I1003 00:20:48.386845 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-72k5k"] Oct 03 00:20:48 crc kubenswrapper[4832]: I1003 00:20:48.527382 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzss4\" (UniqueName: \"kubernetes.io/projected/46fbcd8c-bf77-46b4-acb9-25eb4bb0d75d-kube-api-access-tzss4\") pod \"certified-operators-72k5k\" (UID: \"46fbcd8c-bf77-46b4-acb9-25eb4bb0d75d\") " pod="openshift-marketplace/certified-operators-72k5k" Oct 03 00:20:48 crc kubenswrapper[4832]: I1003 00:20:48.527516 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/46fbcd8c-bf77-46b4-acb9-25eb4bb0d75d-catalog-content\") pod \"certified-operators-72k5k\" (UID: \"46fbcd8c-bf77-46b4-acb9-25eb4bb0d75d\") " pod="openshift-marketplace/certified-operators-72k5k" Oct 03 00:20:48 crc kubenswrapper[4832]: I1003 00:20:48.527679 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/46fbcd8c-bf77-46b4-acb9-25eb4bb0d75d-utilities\") pod \"certified-operators-72k5k\" (UID: \"46fbcd8c-bf77-46b4-acb9-25eb4bb0d75d\") " pod="openshift-marketplace/certified-operators-72k5k" Oct 03 00:20:48 crc kubenswrapper[4832]: I1003 00:20:48.629323 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tzss4\" (UniqueName: \"kubernetes.io/projected/46fbcd8c-bf77-46b4-acb9-25eb4bb0d75d-kube-api-access-tzss4\") pod \"certified-operators-72k5k\" (UID: \"46fbcd8c-bf77-46b4-acb9-25eb4bb0d75d\") " pod="openshift-marketplace/certified-operators-72k5k" Oct 03 00:20:48 crc kubenswrapper[4832]: I1003 00:20:48.629379 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/46fbcd8c-bf77-46b4-acb9-25eb4bb0d75d-catalog-content\") pod \"certified-operators-72k5k\" (UID: \"46fbcd8c-bf77-46b4-acb9-25eb4bb0d75d\") " pod="openshift-marketplace/certified-operators-72k5k" Oct 03 00:20:48 crc kubenswrapper[4832]: I1003 00:20:48.629442 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/46fbcd8c-bf77-46b4-acb9-25eb4bb0d75d-utilities\") pod \"certified-operators-72k5k\" (UID: \"46fbcd8c-bf77-46b4-acb9-25eb4bb0d75d\") " pod="openshift-marketplace/certified-operators-72k5k" Oct 03 00:20:48 crc kubenswrapper[4832]: I1003 00:20:48.629926 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/46fbcd8c-bf77-46b4-acb9-25eb4bb0d75d-utilities\") pod \"certified-operators-72k5k\" (UID: \"46fbcd8c-bf77-46b4-acb9-25eb4bb0d75d\") " pod="openshift-marketplace/certified-operators-72k5k" Oct 03 00:20:48 crc kubenswrapper[4832]: I1003 00:20:48.630152 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/46fbcd8c-bf77-46b4-acb9-25eb4bb0d75d-catalog-content\") pod \"certified-operators-72k5k\" (UID: \"46fbcd8c-bf77-46b4-acb9-25eb4bb0d75d\") " pod="openshift-marketplace/certified-operators-72k5k" Oct 03 00:20:48 crc kubenswrapper[4832]: I1003 00:20:48.657892 4832 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-tzss4\" (UniqueName: \"kubernetes.io/projected/46fbcd8c-bf77-46b4-acb9-25eb4bb0d75d-kube-api-access-tzss4\") pod \"certified-operators-72k5k\" (UID: \"46fbcd8c-bf77-46b4-acb9-25eb4bb0d75d\") " pod="openshift-marketplace/certified-operators-72k5k" Oct 03 00:20:48 crc kubenswrapper[4832]: I1003 00:20:48.692627 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-72k5k" Oct 03 00:20:48 crc kubenswrapper[4832]: I1003 00:20:48.960664 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-72k5k"] Oct 03 00:20:49 crc kubenswrapper[4832]: I1003 00:20:49.451300 4832 generic.go:334] "Generic (PLEG): container finished" podID="46fbcd8c-bf77-46b4-acb9-25eb4bb0d75d" containerID="8aab4771baf2a5fabde4739b35b494dccef1cbf52c2f1a7121fdb3e97a6c8b20" exitCode=0 Oct 03 00:20:49 crc kubenswrapper[4832]: I1003 00:20:49.451342 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-72k5k" event={"ID":"46fbcd8c-bf77-46b4-acb9-25eb4bb0d75d","Type":"ContainerDied","Data":"8aab4771baf2a5fabde4739b35b494dccef1cbf52c2f1a7121fdb3e97a6c8b20"} Oct 03 00:20:49 crc kubenswrapper[4832]: I1003 00:20:49.451368 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-72k5k" event={"ID":"46fbcd8c-bf77-46b4-acb9-25eb4bb0d75d","Type":"ContainerStarted","Data":"c6289cab7c6dbc6255cc2655f13146c63e5b3b11df9b41b0d4eaa2df785c1cdb"} Oct 03 00:20:50 crc kubenswrapper[4832]: I1003 00:20:50.461992 4832 generic.go:334] "Generic (PLEG): container finished" podID="46fbcd8c-bf77-46b4-acb9-25eb4bb0d75d" containerID="bbfe77f85f4784bf00acf364985edb5bbab793415803c0769dad86bb22540bf9" exitCode=0 Oct 03 00:20:50 crc kubenswrapper[4832]: I1003 00:20:50.462034 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-72k5k" event={"ID":"46fbcd8c-bf77-46b4-acb9-25eb4bb0d75d","Type":"ContainerDied","Data":"bbfe77f85f4784bf00acf364985edb5bbab793415803c0769dad86bb22540bf9"} Oct 03 00:20:50 crc kubenswrapper[4832]: I1003 00:20:50.617275 4832 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-7qm46" Oct 03 00:20:50 crc kubenswrapper[4832]: I1003 00:20:50.617329 4832 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-7qm46" Oct 03 00:20:50 crc kubenswrapper[4832]: I1003 00:20:50.652288 4832 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-7qm46" Oct 03 00:20:51 crc kubenswrapper[4832]: I1003 00:20:51.470831 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-72k5k" event={"ID":"46fbcd8c-bf77-46b4-acb9-25eb4bb0d75d","Type":"ContainerStarted","Data":"b715909d48a4a84ff5f847d593d67e483aef2445722209a8c41ff3e830db2690"} Oct 03 00:20:51 crc kubenswrapper[4832]: I1003 00:20:51.490084 4832 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-72k5k" podStartSLOduration=1.9468426669999999 podStartE2EDuration="3.490071174s" podCreationTimestamp="2025-10-03 00:20:48 +0000 UTC" firstStartedPulling="2025-10-03 00:20:49.452821002 +0000 UTC m=+842.714976108" lastFinishedPulling="2025-10-03 00:20:50.996049509 +0000 UTC m=+844.258204615" observedRunningTime="2025-10-03 
00:20:51.487133703 +0000 UTC m=+844.749288809" watchObservedRunningTime="2025-10-03 00:20:51.490071174 +0000 UTC m=+844.752226280" Oct 03 00:20:51 crc kubenswrapper[4832]: I1003 00:20:51.507869 4832 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-7qm46" Oct 03 00:20:52 crc kubenswrapper[4832]: I1003 00:20:52.953357 4832 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7qm46"] Oct 03 00:20:53 crc kubenswrapper[4832]: I1003 00:20:53.479967 4832 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-7qm46" podUID="d47f949d-a8be-45c6-b112-9421b16e4e64" containerName="registry-server" containerID="cri-o://1bd3d3edd18d2ffb1a40c48075cb4bdc7f8ce09ce616869de2e24990ebcb7dda" gracePeriod=2 Oct 03 00:20:53 crc kubenswrapper[4832]: I1003 00:20:53.882582 4832 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7qm46" Oct 03 00:20:53 crc kubenswrapper[4832]: I1003 00:20:53.992545 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2zszk\" (UniqueName: \"kubernetes.io/projected/d47f949d-a8be-45c6-b112-9421b16e4e64-kube-api-access-2zszk\") pod \"d47f949d-a8be-45c6-b112-9421b16e4e64\" (UID: \"d47f949d-a8be-45c6-b112-9421b16e4e64\") " Oct 03 00:20:53 crc kubenswrapper[4832]: I1003 00:20:53.993576 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d47f949d-a8be-45c6-b112-9421b16e4e64-utilities\") pod \"d47f949d-a8be-45c6-b112-9421b16e4e64\" (UID: \"d47f949d-a8be-45c6-b112-9421b16e4e64\") " Oct 03 00:20:53 crc kubenswrapper[4832]: I1003 00:20:53.993619 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d47f949d-a8be-45c6-b112-9421b16e4e64-catalog-content\") pod \"d47f949d-a8be-45c6-b112-9421b16e4e64\" (UID: \"d47f949d-a8be-45c6-b112-9421b16e4e64\") " Oct 03 00:20:53 crc kubenswrapper[4832]: I1003 00:20:53.994369 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d47f949d-a8be-45c6-b112-9421b16e4e64-utilities" (OuterVolumeSpecName: "utilities") pod "d47f949d-a8be-45c6-b112-9421b16e4e64" (UID: "d47f949d-a8be-45c6-b112-9421b16e4e64"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 00:20:54 crc kubenswrapper[4832]: I1003 00:20:54.005012 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d47f949d-a8be-45c6-b112-9421b16e4e64-kube-api-access-2zszk" (OuterVolumeSpecName: "kube-api-access-2zszk") pod "d47f949d-a8be-45c6-b112-9421b16e4e64" (UID: "d47f949d-a8be-45c6-b112-9421b16e4e64"). InnerVolumeSpecName "kube-api-access-2zszk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 00:20:54 crc kubenswrapper[4832]: I1003 00:20:54.014061 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d47f949d-a8be-45c6-b112-9421b16e4e64-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d47f949d-a8be-45c6-b112-9421b16e4e64" (UID: "d47f949d-a8be-45c6-b112-9421b16e4e64"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 00:20:54 crc kubenswrapper[4832]: I1003 00:20:54.094636 4832 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2zszk\" (UniqueName: \"kubernetes.io/projected/d47f949d-a8be-45c6-b112-9421b16e4e64-kube-api-access-2zszk\") on node \"crc\" DevicePath \"\"" Oct 03 00:20:54 crc kubenswrapper[4832]: I1003 00:20:54.094937 4832 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d47f949d-a8be-45c6-b112-9421b16e4e64-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 00:20:54 crc kubenswrapper[4832]: I1003 00:20:54.095029 4832 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d47f949d-a8be-45c6-b112-9421b16e4e64-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 00:20:54 crc kubenswrapper[4832]: I1003 00:20:54.485931 4832 generic.go:334] "Generic (PLEG): container finished" podID="d47f949d-a8be-45c6-b112-9421b16e4e64" containerID="1bd3d3edd18d2ffb1a40c48075cb4bdc7f8ce09ce616869de2e24990ebcb7dda" exitCode=0 Oct 03 00:20:54 crc kubenswrapper[4832]: I1003 00:20:54.485977 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7qm46" event={"ID":"d47f949d-a8be-45c6-b112-9421b16e4e64","Type":"ContainerDied","Data":"1bd3d3edd18d2ffb1a40c48075cb4bdc7f8ce09ce616869de2e24990ebcb7dda"} Oct 03 00:20:54 crc kubenswrapper[4832]: I1003 00:20:54.486007 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7qm46" event={"ID":"d47f949d-a8be-45c6-b112-9421b16e4e64","Type":"ContainerDied","Data":"010607491e4f325ee7aaf90d1d46dbca23f0fa38ad36b02f0939f06c95434ce6"} Oct 03 00:20:54 crc kubenswrapper[4832]: I1003 00:20:54.486027 4832 scope.go:117] "RemoveContainer" containerID="1bd3d3edd18d2ffb1a40c48075cb4bdc7f8ce09ce616869de2e24990ebcb7dda" Oct 03 00:20:54 crc kubenswrapper[4832]: I1003 00:20:54.486193 4832 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7qm46" Oct 03 00:20:54 crc kubenswrapper[4832]: I1003 00:20:54.500219 4832 scope.go:117] "RemoveContainer" containerID="5d29bbc583e51f4b73151679576a576643c8209c2e263b56891609a40367bcc7" Oct 03 00:20:54 crc kubenswrapper[4832]: I1003 00:20:54.517635 4832 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7qm46"] Oct 03 00:20:54 crc kubenswrapper[4832]: I1003 00:20:54.517877 4832 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-7qm46"] Oct 03 00:20:54 crc kubenswrapper[4832]: I1003 00:20:54.522766 4832 scope.go:117] "RemoveContainer" containerID="5e699851165713d21eb470ccda9bc2106c487a538ecd32be877133b4916c7b6b" Oct 03 00:20:54 crc kubenswrapper[4832]: I1003 00:20:54.548661 4832 scope.go:117] "RemoveContainer" containerID="1bd3d3edd18d2ffb1a40c48075cb4bdc7f8ce09ce616869de2e24990ebcb7dda" Oct 03 00:20:54 crc kubenswrapper[4832]: E1003 00:20:54.549164 4832 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1bd3d3edd18d2ffb1a40c48075cb4bdc7f8ce09ce616869de2e24990ebcb7dda\": container with ID starting with 1bd3d3edd18d2ffb1a40c48075cb4bdc7f8ce09ce616869de2e24990ebcb7dda not found: ID does not exist" containerID="1bd3d3edd18d2ffb1a40c48075cb4bdc7f8ce09ce616869de2e24990ebcb7dda" Oct 03 00:20:54 crc kubenswrapper[4832]: I1003 00:20:54.549203 4832 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1bd3d3edd18d2ffb1a40c48075cb4bdc7f8ce09ce616869de2e24990ebcb7dda"} err="failed to get container status \"1bd3d3edd18d2ffb1a40c48075cb4bdc7f8ce09ce616869de2e24990ebcb7dda\": rpc error: code = NotFound desc = could not find container \"1bd3d3edd18d2ffb1a40c48075cb4bdc7f8ce09ce616869de2e24990ebcb7dda\": container with ID starting with 1bd3d3edd18d2ffb1a40c48075cb4bdc7f8ce09ce616869de2e24990ebcb7dda not found: ID does not exist" Oct 03 00:20:54 crc kubenswrapper[4832]: I1003 00:20:54.549229 4832 scope.go:117] "RemoveContainer" containerID="5d29bbc583e51f4b73151679576a576643c8209c2e263b56891609a40367bcc7" Oct 03 00:20:54 crc kubenswrapper[4832]: E1003 00:20:54.550201 4832 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d29bbc583e51f4b73151679576a576643c8209c2e263b56891609a40367bcc7\": container with ID starting with 5d29bbc583e51f4b73151679576a576643c8209c2e263b56891609a40367bcc7 not found: ID does not exist" containerID="5d29bbc583e51f4b73151679576a576643c8209c2e263b56891609a40367bcc7" Oct 03 00:20:54 crc kubenswrapper[4832]: I1003 00:20:54.550236 4832 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d29bbc583e51f4b73151679576a576643c8209c2e263b56891609a40367bcc7"} err="failed to get container status \"5d29bbc583e51f4b73151679576a576643c8209c2e263b56891609a40367bcc7\": rpc error: code = NotFound desc = could not find container \"5d29bbc583e51f4b73151679576a576643c8209c2e263b56891609a40367bcc7\": container with ID starting with 5d29bbc583e51f4b73151679576a576643c8209c2e263b56891609a40367bcc7 not found: ID does not exist" Oct 03 00:20:54 crc kubenswrapper[4832]: I1003 00:20:54.550253 4832 scope.go:117] "RemoveContainer" containerID="5e699851165713d21eb470ccda9bc2106c487a538ecd32be877133b4916c7b6b" Oct 03 00:20:54 crc kubenswrapper[4832]: E1003 00:20:54.550625 4832 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"5e699851165713d21eb470ccda9bc2106c487a538ecd32be877133b4916c7b6b\": container with ID starting with 5e699851165713d21eb470ccda9bc2106c487a538ecd32be877133b4916c7b6b not found: ID does not exist" containerID="5e699851165713d21eb470ccda9bc2106c487a538ecd32be877133b4916c7b6b" Oct 03 00:20:54 crc kubenswrapper[4832]: I1003 00:20:54.550666 4832 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e699851165713d21eb470ccda9bc2106c487a538ecd32be877133b4916c7b6b"} err="failed to get container status \"5e699851165713d21eb470ccda9bc2106c487a538ecd32be877133b4916c7b6b\": rpc error: code = NotFound desc = could not find container \"5e699851165713d21eb470ccda9bc2106c487a538ecd32be877133b4916c7b6b\": container with ID starting with 5e699851165713d21eb470ccda9bc2106c487a538ecd32be877133b4916c7b6b not found: ID does not exist" Oct 03 00:20:55 crc kubenswrapper[4832]: I1003 00:20:55.392204 4832 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d47f949d-a8be-45c6-b112-9421b16e4e64" path="/var/lib/kubelet/pods/d47f949d-a8be-45c6-b112-9421b16e4e64/volumes" Oct 03 00:20:58 crc kubenswrapper[4832]: I1003 00:20:58.693213 4832 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-72k5k" Oct 03 00:20:58 crc kubenswrapper[4832]: I1003 00:20:58.693556 4832 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-72k5k" Oct 03 00:20:58 crc kubenswrapper[4832]: I1003 00:20:58.731549 4832 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-72k5k" Oct 03 00:20:59 crc kubenswrapper[4832]: I1003 00:20:59.541672 4832 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-72k5k" Oct 03 00:20:59 crc kubenswrapper[4832]: I1003 00:20:59.580432 4832 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-72k5k"] Oct 03 00:21:01 crc kubenswrapper[4832]: I1003 00:21:01.529791 4832 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-72k5k" podUID="46fbcd8c-bf77-46b4-acb9-25eb4bb0d75d" containerName="registry-server" containerID="cri-o://b715909d48a4a84ff5f847d593d67e483aef2445722209a8c41ff3e830db2690" gracePeriod=2 Oct 03 00:21:02 crc kubenswrapper[4832]: I1003 00:21:02.383041 4832 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-72k5k" Oct 03 00:21:02 crc kubenswrapper[4832]: I1003 00:21:02.495030 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/46fbcd8c-bf77-46b4-acb9-25eb4bb0d75d-catalog-content\") pod \"46fbcd8c-bf77-46b4-acb9-25eb4bb0d75d\" (UID: \"46fbcd8c-bf77-46b4-acb9-25eb4bb0d75d\") " Oct 03 00:21:02 crc kubenswrapper[4832]: I1003 00:21:02.495152 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tzss4\" (UniqueName: \"kubernetes.io/projected/46fbcd8c-bf77-46b4-acb9-25eb4bb0d75d-kube-api-access-tzss4\") pod \"46fbcd8c-bf77-46b4-acb9-25eb4bb0d75d\" (UID: \"46fbcd8c-bf77-46b4-acb9-25eb4bb0d75d\") " Oct 03 00:21:02 crc kubenswrapper[4832]: I1003 00:21:02.495220 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/46fbcd8c-bf77-46b4-acb9-25eb4bb0d75d-utilities\") pod \"46fbcd8c-bf77-46b4-acb9-25eb4bb0d75d\" (UID: \"46fbcd8c-bf77-46b4-acb9-25eb4bb0d75d\") " Oct 03 00:21:02 crc kubenswrapper[4832]: I1003 00:21:02.496493 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/46fbcd8c-bf77-46b4-acb9-25eb4bb0d75d-utilities" (OuterVolumeSpecName: "utilities") pod "46fbcd8c-bf77-46b4-acb9-25eb4bb0d75d" (UID: "46fbcd8c-bf77-46b4-acb9-25eb4bb0d75d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 00:21:02 crc kubenswrapper[4832]: I1003 00:21:02.505612 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46fbcd8c-bf77-46b4-acb9-25eb4bb0d75d-kube-api-access-tzss4" (OuterVolumeSpecName: "kube-api-access-tzss4") pod "46fbcd8c-bf77-46b4-acb9-25eb4bb0d75d" (UID: "46fbcd8c-bf77-46b4-acb9-25eb4bb0d75d"). InnerVolumeSpecName "kube-api-access-tzss4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 00:21:02 crc kubenswrapper[4832]: I1003 00:21:02.536059 4832 generic.go:334] "Generic (PLEG): container finished" podID="46fbcd8c-bf77-46b4-acb9-25eb4bb0d75d" containerID="b715909d48a4a84ff5f847d593d67e483aef2445722209a8c41ff3e830db2690" exitCode=0 Oct 03 00:21:02 crc kubenswrapper[4832]: I1003 00:21:02.536106 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-72k5k" event={"ID":"46fbcd8c-bf77-46b4-acb9-25eb4bb0d75d","Type":"ContainerDied","Data":"b715909d48a4a84ff5f847d593d67e483aef2445722209a8c41ff3e830db2690"} Oct 03 00:21:02 crc kubenswrapper[4832]: I1003 00:21:02.536136 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-72k5k" event={"ID":"46fbcd8c-bf77-46b4-acb9-25eb4bb0d75d","Type":"ContainerDied","Data":"c6289cab7c6dbc6255cc2655f13146c63e5b3b11df9b41b0d4eaa2df785c1cdb"} Oct 03 00:21:02 crc kubenswrapper[4832]: I1003 00:21:02.536154 4832 scope.go:117] "RemoveContainer" containerID="b715909d48a4a84ff5f847d593d67e483aef2445722209a8c41ff3e830db2690" Oct 03 00:21:02 crc kubenswrapper[4832]: I1003 00:21:02.536239 4832 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-72k5k" Oct 03 00:21:02 crc kubenswrapper[4832]: I1003 00:21:02.542982 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/46fbcd8c-bf77-46b4-acb9-25eb4bb0d75d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "46fbcd8c-bf77-46b4-acb9-25eb4bb0d75d" (UID: "46fbcd8c-bf77-46b4-acb9-25eb4bb0d75d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 00:21:02 crc kubenswrapper[4832]: I1003 00:21:02.553203 4832 scope.go:117] "RemoveContainer" containerID="bbfe77f85f4784bf00acf364985edb5bbab793415803c0769dad86bb22540bf9" Oct 03 00:21:02 crc kubenswrapper[4832]: I1003 00:21:02.568595 4832 scope.go:117] "RemoveContainer" containerID="8aab4771baf2a5fabde4739b35b494dccef1cbf52c2f1a7121fdb3e97a6c8b20" Oct 03 00:21:02 crc kubenswrapper[4832]: I1003 00:21:02.584215 4832 scope.go:117] "RemoveContainer" containerID="b715909d48a4a84ff5f847d593d67e483aef2445722209a8c41ff3e830db2690" Oct 03 00:21:02 crc kubenswrapper[4832]: E1003 00:21:02.584856 4832 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b715909d48a4a84ff5f847d593d67e483aef2445722209a8c41ff3e830db2690\": container with ID starting with b715909d48a4a84ff5f847d593d67e483aef2445722209a8c41ff3e830db2690 not found: ID does not exist" containerID="b715909d48a4a84ff5f847d593d67e483aef2445722209a8c41ff3e830db2690" Oct 03 00:21:02 crc kubenswrapper[4832]: I1003 00:21:02.584900 4832 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b715909d48a4a84ff5f847d593d67e483aef2445722209a8c41ff3e830db2690"} err="failed to get container status \"b715909d48a4a84ff5f847d593d67e483aef2445722209a8c41ff3e830db2690\": rpc error: code = NotFound desc = could not find container \"b715909d48a4a84ff5f847d593d67e483aef2445722209a8c41ff3e830db2690\": container with ID starting with b715909d48a4a84ff5f847d593d67e483aef2445722209a8c41ff3e830db2690 not found: ID does not exist" Oct 03 00:21:02 crc kubenswrapper[4832]: I1003 00:21:02.584929 4832 scope.go:117] "RemoveContainer" containerID="bbfe77f85f4784bf00acf364985edb5bbab793415803c0769dad86bb22540bf9" Oct 03 00:21:02 crc kubenswrapper[4832]: E1003 00:21:02.585301 4832 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bbfe77f85f4784bf00acf364985edb5bbab793415803c0769dad86bb22540bf9\": container with ID starting with bbfe77f85f4784bf00acf364985edb5bbab793415803c0769dad86bb22540bf9 not found: ID does not exist" containerID="bbfe77f85f4784bf00acf364985edb5bbab793415803c0769dad86bb22540bf9" Oct 03 00:21:02 crc kubenswrapper[4832]: I1003 00:21:02.585337 4832 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bbfe77f85f4784bf00acf364985edb5bbab793415803c0769dad86bb22540bf9"} err="failed to get container status \"bbfe77f85f4784bf00acf364985edb5bbab793415803c0769dad86bb22540bf9\": rpc error: code = NotFound desc = could not find container \"bbfe77f85f4784bf00acf364985edb5bbab793415803c0769dad86bb22540bf9\": container with ID starting with bbfe77f85f4784bf00acf364985edb5bbab793415803c0769dad86bb22540bf9 not found: ID does not exist" Oct 03 00:21:02 crc kubenswrapper[4832]: I1003 00:21:02.585367 4832 scope.go:117] "RemoveContainer" containerID="8aab4771baf2a5fabde4739b35b494dccef1cbf52c2f1a7121fdb3e97a6c8b20" Oct 03 00:21:02 crc 
kubenswrapper[4832]: E1003 00:21:02.585617 4832 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8aab4771baf2a5fabde4739b35b494dccef1cbf52c2f1a7121fdb3e97a6c8b20\": container with ID starting with 8aab4771baf2a5fabde4739b35b494dccef1cbf52c2f1a7121fdb3e97a6c8b20 not found: ID does not exist" containerID="8aab4771baf2a5fabde4739b35b494dccef1cbf52c2f1a7121fdb3e97a6c8b20" Oct 03 00:21:02 crc kubenswrapper[4832]: I1003 00:21:02.585648 4832 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8aab4771baf2a5fabde4739b35b494dccef1cbf52c2f1a7121fdb3e97a6c8b20"} err="failed to get container status \"8aab4771baf2a5fabde4739b35b494dccef1cbf52c2f1a7121fdb3e97a6c8b20\": rpc error: code = NotFound desc = could not find container \"8aab4771baf2a5fabde4739b35b494dccef1cbf52c2f1a7121fdb3e97a6c8b20\": container with ID starting with 8aab4771baf2a5fabde4739b35b494dccef1cbf52c2f1a7121fdb3e97a6c8b20 not found: ID does not exist" Oct 03 00:21:02 crc kubenswrapper[4832]: I1003 00:21:02.597523 4832 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tzss4\" (UniqueName: \"kubernetes.io/projected/46fbcd8c-bf77-46b4-acb9-25eb4bb0d75d-kube-api-access-tzss4\") on node \"crc\" DevicePath \"\"" Oct 03 00:21:02 crc kubenswrapper[4832]: I1003 00:21:02.597741 4832 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/46fbcd8c-bf77-46b4-acb9-25eb4bb0d75d-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 00:21:02 crc kubenswrapper[4832]: I1003 00:21:02.597841 4832 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/46fbcd8c-bf77-46b4-acb9-25eb4bb0d75d-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 00:21:02 crc kubenswrapper[4832]: I1003 00:21:02.867385 4832 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-72k5k"] Oct 03 00:21:02 crc kubenswrapper[4832]: I1003 00:21:02.872099 4832 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-72k5k"] Oct 03 00:21:03 crc kubenswrapper[4832]: I1003 00:21:03.392706 4832 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46fbcd8c-bf77-46b4-acb9-25eb4bb0d75d" path="/var/lib/kubelet/pods/46fbcd8c-bf77-46b4-acb9-25eb4bb0d75d/volumes" Oct 03 00:21:05 crc kubenswrapper[4832]: I1003 00:21:05.443936 4832 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-zfn2d"] Oct 03 00:21:05 crc kubenswrapper[4832]: E1003 00:21:05.444475 4832 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d47f949d-a8be-45c6-b112-9421b16e4e64" containerName="extract-content" Oct 03 00:21:05 crc kubenswrapper[4832]: I1003 00:21:05.444492 4832 state_mem.go:107] "Deleted CPUSet assignment" podUID="d47f949d-a8be-45c6-b112-9421b16e4e64" containerName="extract-content" Oct 03 00:21:05 crc kubenswrapper[4832]: E1003 00:21:05.444511 4832 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d47f949d-a8be-45c6-b112-9421b16e4e64" containerName="extract-utilities" Oct 03 00:21:05 crc kubenswrapper[4832]: I1003 00:21:05.444517 4832 state_mem.go:107] "Deleted CPUSet assignment" podUID="d47f949d-a8be-45c6-b112-9421b16e4e64" containerName="extract-utilities" Oct 03 00:21:05 crc kubenswrapper[4832]: E1003 00:21:05.444524 4832 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="46fbcd8c-bf77-46b4-acb9-25eb4bb0d75d" containerName="registry-server" Oct 03 00:21:05 crc kubenswrapper[4832]: I1003 00:21:05.444531 4832 state_mem.go:107] "Deleted CPUSet assignment" podUID="46fbcd8c-bf77-46b4-acb9-25eb4bb0d75d" containerName="registry-server" Oct 03 00:21:05 crc kubenswrapper[4832]: E1003 00:21:05.444540 4832 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d47f949d-a8be-45c6-b112-9421b16e4e64" containerName="registry-server" Oct 03 00:21:05 crc kubenswrapper[4832]: I1003 00:21:05.444545 4832 state_mem.go:107] "Deleted CPUSet assignment" podUID="d47f949d-a8be-45c6-b112-9421b16e4e64" containerName="registry-server" Oct 03 00:21:05 crc kubenswrapper[4832]: E1003 00:21:05.444556 4832 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46fbcd8c-bf77-46b4-acb9-25eb4bb0d75d" containerName="extract-content" Oct 03 00:21:05 crc kubenswrapper[4832]: I1003 00:21:05.444563 4832 state_mem.go:107] "Deleted CPUSet assignment" podUID="46fbcd8c-bf77-46b4-acb9-25eb4bb0d75d" containerName="extract-content" Oct 03 00:21:05 crc kubenswrapper[4832]: E1003 00:21:05.444575 4832 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46fbcd8c-bf77-46b4-acb9-25eb4bb0d75d" containerName="extract-utilities" Oct 03 00:21:05 crc kubenswrapper[4832]: I1003 00:21:05.444581 4832 state_mem.go:107] "Deleted CPUSet assignment" podUID="46fbcd8c-bf77-46b4-acb9-25eb4bb0d75d" containerName="extract-utilities" Oct 03 00:21:05 crc kubenswrapper[4832]: I1003 00:21:05.444658 4832 memory_manager.go:354] "RemoveStaleState removing state" podUID="46fbcd8c-bf77-46b4-acb9-25eb4bb0d75d" containerName="registry-server" Oct 03 00:21:05 crc kubenswrapper[4832]: I1003 00:21:05.444671 4832 memory_manager.go:354] "RemoveStaleState removing state" podUID="d47f949d-a8be-45c6-b112-9421b16e4e64" containerName="registry-server" Oct 03 00:21:05 crc kubenswrapper[4832]: I1003 00:21:05.445984 4832 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-zfn2d" Oct 03 00:21:05 crc kubenswrapper[4832]: I1003 00:21:05.460070 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zfn2d"] Oct 03 00:21:05 crc kubenswrapper[4832]: I1003 00:21:05.529614 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpwm4\" (UniqueName: \"kubernetes.io/projected/9656d881-eeb6-44aa-a89a-dfffd1ba51fd-kube-api-access-lpwm4\") pod \"community-operators-zfn2d\" (UID: \"9656d881-eeb6-44aa-a89a-dfffd1ba51fd\") " pod="openshift-marketplace/community-operators-zfn2d" Oct 03 00:21:05 crc kubenswrapper[4832]: I1003 00:21:05.529795 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9656d881-eeb6-44aa-a89a-dfffd1ba51fd-catalog-content\") pod \"community-operators-zfn2d\" (UID: \"9656d881-eeb6-44aa-a89a-dfffd1ba51fd\") " pod="openshift-marketplace/community-operators-zfn2d" Oct 03 00:21:05 crc kubenswrapper[4832]: I1003 00:21:05.529868 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9656d881-eeb6-44aa-a89a-dfffd1ba51fd-utilities\") pod \"community-operators-zfn2d\" (UID: \"9656d881-eeb6-44aa-a89a-dfffd1ba51fd\") " pod="openshift-marketplace/community-operators-zfn2d" Oct 03 00:21:05 crc kubenswrapper[4832]: I1003 00:21:05.631490 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lpwm4\" (UniqueName: \"kubernetes.io/projected/9656d881-eeb6-44aa-a89a-dfffd1ba51fd-kube-api-access-lpwm4\") pod \"community-operators-zfn2d\" (UID: \"9656d881-eeb6-44aa-a89a-dfffd1ba51fd\") " pod="openshift-marketplace/community-operators-zfn2d" Oct 03 00:21:05 crc kubenswrapper[4832]: I1003 00:21:05.631578 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9656d881-eeb6-44aa-a89a-dfffd1ba51fd-catalog-content\") pod \"community-operators-zfn2d\" (UID: \"9656d881-eeb6-44aa-a89a-dfffd1ba51fd\") " pod="openshift-marketplace/community-operators-zfn2d" Oct 03 00:21:05 crc kubenswrapper[4832]: I1003 00:21:05.631640 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9656d881-eeb6-44aa-a89a-dfffd1ba51fd-utilities\") pod \"community-operators-zfn2d\" (UID: \"9656d881-eeb6-44aa-a89a-dfffd1ba51fd\") " pod="openshift-marketplace/community-operators-zfn2d" Oct 03 00:21:05 crc kubenswrapper[4832]: I1003 00:21:05.632182 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9656d881-eeb6-44aa-a89a-dfffd1ba51fd-utilities\") pod \"community-operators-zfn2d\" (UID: \"9656d881-eeb6-44aa-a89a-dfffd1ba51fd\") " pod="openshift-marketplace/community-operators-zfn2d" Oct 03 00:21:05 crc kubenswrapper[4832]: I1003 00:21:05.632280 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9656d881-eeb6-44aa-a89a-dfffd1ba51fd-catalog-content\") pod \"community-operators-zfn2d\" (UID: \"9656d881-eeb6-44aa-a89a-dfffd1ba51fd\") " pod="openshift-marketplace/community-operators-zfn2d" Oct 03 00:21:05 crc kubenswrapper[4832]: I1003 00:21:05.649797 4832 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-lpwm4\" (UniqueName: \"kubernetes.io/projected/9656d881-eeb6-44aa-a89a-dfffd1ba51fd-kube-api-access-lpwm4\") pod \"community-operators-zfn2d\" (UID: \"9656d881-eeb6-44aa-a89a-dfffd1ba51fd\") " pod="openshift-marketplace/community-operators-zfn2d" Oct 03 00:21:05 crc kubenswrapper[4832]: I1003 00:21:05.765112 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zfn2d" Oct 03 00:21:06 crc kubenswrapper[4832]: I1003 00:21:05.999948 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zfn2d"] Oct 03 00:21:06 crc kubenswrapper[4832]: I1003 00:21:06.557925 4832 generic.go:334] "Generic (PLEG): container finished" podID="9656d881-eeb6-44aa-a89a-dfffd1ba51fd" containerID="f5f81a02c62ac3f19df2233de0abc0ec4a865f05e94a01f8913a74bea932859c" exitCode=0 Oct 03 00:21:06 crc kubenswrapper[4832]: I1003 00:21:06.558784 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zfn2d" event={"ID":"9656d881-eeb6-44aa-a89a-dfffd1ba51fd","Type":"ContainerDied","Data":"f5f81a02c62ac3f19df2233de0abc0ec4a865f05e94a01f8913a74bea932859c"} Oct 03 00:21:06 crc kubenswrapper[4832]: I1003 00:21:06.558867 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zfn2d" event={"ID":"9656d881-eeb6-44aa-a89a-dfffd1ba51fd","Type":"ContainerStarted","Data":"efdf649b2410f833379d6ac51942edb5e52b46f12ab65c682e2898fb3a97eb1f"} Oct 03 00:21:07 crc kubenswrapper[4832]: I1003 00:21:07.565330 4832 generic.go:334] "Generic (PLEG): container finished" podID="9656d881-eeb6-44aa-a89a-dfffd1ba51fd" containerID="7923fc0c92d29c77e0c4abb7df443ce05eb8fc7df2c55e47c74d5b315fcad06b" exitCode=0 Oct 03 00:21:07 crc kubenswrapper[4832]: I1003 00:21:07.565375 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zfn2d" event={"ID":"9656d881-eeb6-44aa-a89a-dfffd1ba51fd","Type":"ContainerDied","Data":"7923fc0c92d29c77e0c4abb7df443ce05eb8fc7df2c55e47c74d5b315fcad06b"} Oct 03 00:21:08 crc kubenswrapper[4832]: I1003 00:21:08.572022 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zfn2d" event={"ID":"9656d881-eeb6-44aa-a89a-dfffd1ba51fd","Type":"ContainerStarted","Data":"0c6383de67371c62d7014db8d28369566f3fa8302670112d6bef4cc29384c326"} Oct 03 00:21:08 crc kubenswrapper[4832]: I1003 00:21:08.588925 4832 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-zfn2d" podStartSLOduration=2.010183496 podStartE2EDuration="3.588907401s" podCreationTimestamp="2025-10-03 00:21:05 +0000 UTC" firstStartedPulling="2025-10-03 00:21:06.560854518 +0000 UTC m=+859.823009624" lastFinishedPulling="2025-10-03 00:21:08.139578423 +0000 UTC m=+861.401733529" observedRunningTime="2025-10-03 00:21:08.58726594 +0000 UTC m=+861.849421056" watchObservedRunningTime="2025-10-03 00:21:08.588907401 +0000 UTC m=+861.851062507" Oct 03 00:21:15 crc kubenswrapper[4832]: I1003 00:21:15.765543 4832 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-zfn2d" Oct 03 00:21:15 crc kubenswrapper[4832]: I1003 00:21:15.766042 4832 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-zfn2d" Oct 03 00:21:15 crc kubenswrapper[4832]: I1003 00:21:15.802401 4832 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-zfn2d" Oct 03 00:21:16 crc kubenswrapper[4832]: I1003 00:21:16.637803 4832 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-zfn2d" Oct 03 00:21:16 crc kubenswrapper[4832]: I1003 00:21:16.674780 4832 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zfn2d"] Oct 03 00:21:18 crc kubenswrapper[4832]: I1003 00:21:18.616658 4832 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-zfn2d" podUID="9656d881-eeb6-44aa-a89a-dfffd1ba51fd" containerName="registry-server" containerID="cri-o://0c6383de67371c62d7014db8d28369566f3fa8302670112d6bef4cc29384c326" gracePeriod=2 Oct 03 00:21:18 crc kubenswrapper[4832]: I1003 00:21:18.954239 4832 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zfn2d" Oct 03 00:21:19 crc kubenswrapper[4832]: I1003 00:21:19.079066 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9656d881-eeb6-44aa-a89a-dfffd1ba51fd-utilities\") pod \"9656d881-eeb6-44aa-a89a-dfffd1ba51fd\" (UID: \"9656d881-eeb6-44aa-a89a-dfffd1ba51fd\") " Oct 03 00:21:19 crc kubenswrapper[4832]: I1003 00:21:19.079172 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9656d881-eeb6-44aa-a89a-dfffd1ba51fd-catalog-content\") pod \"9656d881-eeb6-44aa-a89a-dfffd1ba51fd\" (UID: \"9656d881-eeb6-44aa-a89a-dfffd1ba51fd\") " Oct 03 00:21:19 crc kubenswrapper[4832]: I1003 00:21:19.079245 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lpwm4\" (UniqueName: \"kubernetes.io/projected/9656d881-eeb6-44aa-a89a-dfffd1ba51fd-kube-api-access-lpwm4\") pod \"9656d881-eeb6-44aa-a89a-dfffd1ba51fd\" (UID: \"9656d881-eeb6-44aa-a89a-dfffd1ba51fd\") " Oct 03 00:21:19 crc kubenswrapper[4832]: I1003 00:21:19.080006 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9656d881-eeb6-44aa-a89a-dfffd1ba51fd-utilities" (OuterVolumeSpecName: "utilities") pod "9656d881-eeb6-44aa-a89a-dfffd1ba51fd" (UID: "9656d881-eeb6-44aa-a89a-dfffd1ba51fd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 00:21:19 crc kubenswrapper[4832]: I1003 00:21:19.084906 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9656d881-eeb6-44aa-a89a-dfffd1ba51fd-kube-api-access-lpwm4" (OuterVolumeSpecName: "kube-api-access-lpwm4") pod "9656d881-eeb6-44aa-a89a-dfffd1ba51fd" (UID: "9656d881-eeb6-44aa-a89a-dfffd1ba51fd"). InnerVolumeSpecName "kube-api-access-lpwm4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 00:21:19 crc kubenswrapper[4832]: I1003 00:21:19.128409 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9656d881-eeb6-44aa-a89a-dfffd1ba51fd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9656d881-eeb6-44aa-a89a-dfffd1ba51fd" (UID: "9656d881-eeb6-44aa-a89a-dfffd1ba51fd"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 00:21:19 crc kubenswrapper[4832]: I1003 00:21:19.180481 4832 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9656d881-eeb6-44aa-a89a-dfffd1ba51fd-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 00:21:19 crc kubenswrapper[4832]: I1003 00:21:19.180538 4832 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lpwm4\" (UniqueName: \"kubernetes.io/projected/9656d881-eeb6-44aa-a89a-dfffd1ba51fd-kube-api-access-lpwm4\") on node \"crc\" DevicePath \"\"" Oct 03 00:21:19 crc kubenswrapper[4832]: I1003 00:21:19.180552 4832 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9656d881-eeb6-44aa-a89a-dfffd1ba51fd-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 00:21:19 crc kubenswrapper[4832]: I1003 00:21:19.623329 4832 generic.go:334] "Generic (PLEG): container finished" podID="9656d881-eeb6-44aa-a89a-dfffd1ba51fd" containerID="0c6383de67371c62d7014db8d28369566f3fa8302670112d6bef4cc29384c326" exitCode=0 Oct 03 00:21:19 crc kubenswrapper[4832]: I1003 00:21:19.623369 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zfn2d" event={"ID":"9656d881-eeb6-44aa-a89a-dfffd1ba51fd","Type":"ContainerDied","Data":"0c6383de67371c62d7014db8d28369566f3fa8302670112d6bef4cc29384c326"} Oct 03 00:21:19 crc kubenswrapper[4832]: I1003 00:21:19.623391 4832 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zfn2d" Oct 03 00:21:19 crc kubenswrapper[4832]: I1003 00:21:19.623428 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zfn2d" event={"ID":"9656d881-eeb6-44aa-a89a-dfffd1ba51fd","Type":"ContainerDied","Data":"efdf649b2410f833379d6ac51942edb5e52b46f12ab65c682e2898fb3a97eb1f"} Oct 03 00:21:19 crc kubenswrapper[4832]: I1003 00:21:19.623447 4832 scope.go:117] "RemoveContainer" containerID="0c6383de67371c62d7014db8d28369566f3fa8302670112d6bef4cc29384c326" Oct 03 00:21:19 crc kubenswrapper[4832]: I1003 00:21:19.642766 4832 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zfn2d"] Oct 03 00:21:19 crc kubenswrapper[4832]: I1003 00:21:19.649016 4832 scope.go:117] "RemoveContainer" containerID="7923fc0c92d29c77e0c4abb7df443ce05eb8fc7df2c55e47c74d5b315fcad06b" Oct 03 00:21:19 crc kubenswrapper[4832]: I1003 00:21:19.649388 4832 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-zfn2d"] Oct 03 00:21:19 crc kubenswrapper[4832]: I1003 00:21:19.661888 4832 scope.go:117] "RemoveContainer" containerID="f5f81a02c62ac3f19df2233de0abc0ec4a865f05e94a01f8913a74bea932859c" Oct 03 00:21:19 crc kubenswrapper[4832]: I1003 00:21:19.685000 4832 scope.go:117] "RemoveContainer" containerID="0c6383de67371c62d7014db8d28369566f3fa8302670112d6bef4cc29384c326" Oct 03 00:21:19 crc kubenswrapper[4832]: E1003 00:21:19.685450 4832 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c6383de67371c62d7014db8d28369566f3fa8302670112d6bef4cc29384c326\": container with ID starting with 0c6383de67371c62d7014db8d28369566f3fa8302670112d6bef4cc29384c326 not found: ID does not exist" containerID="0c6383de67371c62d7014db8d28369566f3fa8302670112d6bef4cc29384c326" Oct 03 00:21:19 crc kubenswrapper[4832]: I1003 00:21:19.685563 
4832 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c6383de67371c62d7014db8d28369566f3fa8302670112d6bef4cc29384c326"} err="failed to get container status \"0c6383de67371c62d7014db8d28369566f3fa8302670112d6bef4cc29384c326\": rpc error: code = NotFound desc = could not find container \"0c6383de67371c62d7014db8d28369566f3fa8302670112d6bef4cc29384c326\": container with ID starting with 0c6383de67371c62d7014db8d28369566f3fa8302670112d6bef4cc29384c326 not found: ID does not exist" Oct 03 00:21:19 crc kubenswrapper[4832]: I1003 00:21:19.685650 4832 scope.go:117] "RemoveContainer" containerID="7923fc0c92d29c77e0c4abb7df443ce05eb8fc7df2c55e47c74d5b315fcad06b" Oct 03 00:21:19 crc kubenswrapper[4832]: E1003 00:21:19.686170 4832 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7923fc0c92d29c77e0c4abb7df443ce05eb8fc7df2c55e47c74d5b315fcad06b\": container with ID starting with 7923fc0c92d29c77e0c4abb7df443ce05eb8fc7df2c55e47c74d5b315fcad06b not found: ID does not exist" containerID="7923fc0c92d29c77e0c4abb7df443ce05eb8fc7df2c55e47c74d5b315fcad06b" Oct 03 00:21:19 crc kubenswrapper[4832]: I1003 00:21:19.686266 4832 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7923fc0c92d29c77e0c4abb7df443ce05eb8fc7df2c55e47c74d5b315fcad06b"} err="failed to get container status \"7923fc0c92d29c77e0c4abb7df443ce05eb8fc7df2c55e47c74d5b315fcad06b\": rpc error: code = NotFound desc = could not find container \"7923fc0c92d29c77e0c4abb7df443ce05eb8fc7df2c55e47c74d5b315fcad06b\": container with ID starting with 7923fc0c92d29c77e0c4abb7df443ce05eb8fc7df2c55e47c74d5b315fcad06b not found: ID does not exist" Oct 03 00:21:19 crc kubenswrapper[4832]: I1003 00:21:19.686357 4832 scope.go:117] "RemoveContainer" containerID="f5f81a02c62ac3f19df2233de0abc0ec4a865f05e94a01f8913a74bea932859c" Oct 03 00:21:19 crc kubenswrapper[4832]: E1003 00:21:19.686749 4832 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f5f81a02c62ac3f19df2233de0abc0ec4a865f05e94a01f8913a74bea932859c\": container with ID starting with f5f81a02c62ac3f19df2233de0abc0ec4a865f05e94a01f8913a74bea932859c not found: ID does not exist" containerID="f5f81a02c62ac3f19df2233de0abc0ec4a865f05e94a01f8913a74bea932859c" Oct 03 00:21:19 crc kubenswrapper[4832]: I1003 00:21:19.686794 4832 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5f81a02c62ac3f19df2233de0abc0ec4a865f05e94a01f8913a74bea932859c"} err="failed to get container status \"f5f81a02c62ac3f19df2233de0abc0ec4a865f05e94a01f8913a74bea932859c\": rpc error: code = NotFound desc = could not find container \"f5f81a02c62ac3f19df2233de0abc0ec4a865f05e94a01f8913a74bea932859c\": container with ID starting with f5f81a02c62ac3f19df2233de0abc0ec4a865f05e94a01f8913a74bea932859c not found: ID does not exist" Oct 03 00:21:21 crc kubenswrapper[4832]: I1003 00:21:21.399530 4832 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9656d881-eeb6-44aa-a89a-dfffd1ba51fd" path="/var/lib/kubelet/pods/9656d881-eeb6-44aa-a89a-dfffd1ba51fd/volumes" Oct 03 00:21:26 crc kubenswrapper[4832]: I1003 00:21:26.397453 4832 patch_prober.go:28] interesting pod/machine-config-daemon-xzpk4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 00:21:26 crc kubenswrapper[4832]: I1003 00:21:26.397822 4832 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" podUID="13eeb165-19f7-4e3f-a442-1c9c570d853b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 00:21:49 crc kubenswrapper[4832]: I1003 00:21:49.789543 4832 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-zc9xl"] Oct 03 00:21:49 crc kubenswrapper[4832]: I1003 00:21:49.790228 4832 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" podUID="0f189ec1-8169-4886-bff6-f67f7958050e" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://f2c63d70335d75dec7e19ae160dfa2a00f834d9aa3725bb4b63b94aa488fca0f" gracePeriod=30 Oct 03 00:21:49 crc kubenswrapper[4832]: I1003 00:21:49.790230 4832 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" podUID="0f189ec1-8169-4886-bff6-f67f7958050e" containerName="sbdb" containerID="cri-o://ae1e89f85d166170fd478f58e653b46eaa1c8335c1b124af9fde5b805864197b" gracePeriod=30 Oct 03 00:21:49 crc kubenswrapper[4832]: I1003 00:21:49.790295 4832 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" podUID="0f189ec1-8169-4886-bff6-f67f7958050e" containerName="kube-rbac-proxy-node" containerID="cri-o://2ba40dbb3c6b603f4c75b3f017b0b8953a189aac6dba4cc0e1fa5174b6353f3f" gracePeriod=30 Oct 03 00:21:49 crc kubenswrapper[4832]: I1003 00:21:49.790347 4832 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" podUID="0f189ec1-8169-4886-bff6-f67f7958050e" containerName="ovn-acl-logging" containerID="cri-o://e6ee507d1127a1ca82e96f57a031cfef9adbd0b310588ce49a32f0cbcab7b3f4" gracePeriod=30 Oct 03 00:21:49 crc kubenswrapper[4832]: I1003 00:21:49.790336 4832 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" podUID="0f189ec1-8169-4886-bff6-f67f7958050e" containerName="northd" containerID="cri-o://7712a571bceb2c88adb90f9f18cdef6f7666e96341c1f9c87e6a55c82b59efd3" gracePeriod=30 Oct 03 00:21:49 crc kubenswrapper[4832]: I1003 00:21:49.790203 4832 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" podUID="0f189ec1-8169-4886-bff6-f67f7958050e" containerName="nbdb" containerID="cri-o://a3b296d00954bf341fee4ad258c6ee44adbd2950c89e73aaf5141201b61c5332" gracePeriod=30 Oct 03 00:21:49 crc kubenswrapper[4832]: I1003 00:21:49.790136 4832 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" podUID="0f189ec1-8169-4886-bff6-f67f7958050e" containerName="ovn-controller" containerID="cri-o://1e493b0aab45e7a7842b48f0a13a6850b3bcd8a8f2147880ff278b92b04e9535" gracePeriod=30 Oct 03 00:21:49 crc kubenswrapper[4832]: I1003 00:21:49.817018 4832 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" podUID="0f189ec1-8169-4886-bff6-f67f7958050e" containerName="ovnkube-controller" containerID="cri-o://331f9777df369634e691a52d3564282da8b3d170cca2aad394f2d3e7888228ae" gracePeriod=30 
Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.076660 4832 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zc9xl_0f189ec1-8169-4886-bff6-f67f7958050e/ovnkube-controller/3.log" Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.079300 4832 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zc9xl_0f189ec1-8169-4886-bff6-f67f7958050e/ovn-acl-logging/0.log" Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.079775 4832 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zc9xl_0f189ec1-8169-4886-bff6-f67f7958050e/ovn-controller/0.log" Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.080220 4832 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.134139 4832 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-ctmv8"] Oct 03 00:21:50 crc kubenswrapper[4832]: E1003 00:21:50.134370 4832 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f189ec1-8169-4886-bff6-f67f7958050e" containerName="ovn-controller" Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.134385 4832 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f189ec1-8169-4886-bff6-f67f7958050e" containerName="ovn-controller" Oct 03 00:21:50 crc kubenswrapper[4832]: E1003 00:21:50.134395 4832 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f189ec1-8169-4886-bff6-f67f7958050e" containerName="ovnkube-controller" Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.134403 4832 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f189ec1-8169-4886-bff6-f67f7958050e" containerName="ovnkube-controller" Oct 03 00:21:50 crc kubenswrapper[4832]: E1003 00:21:50.134412 4832 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f189ec1-8169-4886-bff6-f67f7958050e" containerName="ovnkube-controller" Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.134419 4832 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f189ec1-8169-4886-bff6-f67f7958050e" containerName="ovnkube-controller" Oct 03 00:21:50 crc kubenswrapper[4832]: E1003 00:21:50.134428 4832 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f189ec1-8169-4886-bff6-f67f7958050e" containerName="sbdb" Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.134436 4832 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f189ec1-8169-4886-bff6-f67f7958050e" containerName="sbdb" Oct 03 00:21:50 crc kubenswrapper[4832]: E1003 00:21:50.134448 4832 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f189ec1-8169-4886-bff6-f67f7958050e" containerName="kube-rbac-proxy-ovn-metrics" Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.134456 4832 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f189ec1-8169-4886-bff6-f67f7958050e" containerName="kube-rbac-proxy-ovn-metrics" Oct 03 00:21:50 crc kubenswrapper[4832]: E1003 00:21:50.134466 4832 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f189ec1-8169-4886-bff6-f67f7958050e" containerName="ovn-acl-logging" Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.134472 4832 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f189ec1-8169-4886-bff6-f67f7958050e" containerName="ovn-acl-logging" Oct 03 00:21:50 crc kubenswrapper[4832]: E1003 00:21:50.134484 4832 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="0f189ec1-8169-4886-bff6-f67f7958050e" containerName="kubecfg-setup" Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.134489 4832 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f189ec1-8169-4886-bff6-f67f7958050e" containerName="kubecfg-setup" Oct 03 00:21:50 crc kubenswrapper[4832]: E1003 00:21:50.134496 4832 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9656d881-eeb6-44aa-a89a-dfffd1ba51fd" containerName="extract-utilities" Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.134502 4832 state_mem.go:107] "Deleted CPUSet assignment" podUID="9656d881-eeb6-44aa-a89a-dfffd1ba51fd" containerName="extract-utilities" Oct 03 00:21:50 crc kubenswrapper[4832]: E1003 00:21:50.134510 4832 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f189ec1-8169-4886-bff6-f67f7958050e" containerName="kube-rbac-proxy-node" Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.134516 4832 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f189ec1-8169-4886-bff6-f67f7958050e" containerName="kube-rbac-proxy-node" Oct 03 00:21:50 crc kubenswrapper[4832]: E1003 00:21:50.134521 4832 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f189ec1-8169-4886-bff6-f67f7958050e" containerName="northd" Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.134527 4832 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f189ec1-8169-4886-bff6-f67f7958050e" containerName="northd" Oct 03 00:21:50 crc kubenswrapper[4832]: E1003 00:21:50.134537 4832 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9656d881-eeb6-44aa-a89a-dfffd1ba51fd" containerName="extract-content" Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.134542 4832 state_mem.go:107] "Deleted CPUSet assignment" podUID="9656d881-eeb6-44aa-a89a-dfffd1ba51fd" containerName="extract-content" Oct 03 00:21:50 crc kubenswrapper[4832]: E1003 00:21:50.134551 4832 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f189ec1-8169-4886-bff6-f67f7958050e" containerName="ovnkube-controller" Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.134559 4832 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f189ec1-8169-4886-bff6-f67f7958050e" containerName="ovnkube-controller" Oct 03 00:21:50 crc kubenswrapper[4832]: E1003 00:21:50.134566 4832 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9656d881-eeb6-44aa-a89a-dfffd1ba51fd" containerName="registry-server" Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.134572 4832 state_mem.go:107] "Deleted CPUSet assignment" podUID="9656d881-eeb6-44aa-a89a-dfffd1ba51fd" containerName="registry-server" Oct 03 00:21:50 crc kubenswrapper[4832]: E1003 00:21:50.134582 4832 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f189ec1-8169-4886-bff6-f67f7958050e" containerName="nbdb" Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.134587 4832 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f189ec1-8169-4886-bff6-f67f7958050e" containerName="nbdb" Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.134675 4832 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f189ec1-8169-4886-bff6-f67f7958050e" containerName="ovn-acl-logging" Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.134685 4832 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f189ec1-8169-4886-bff6-f67f7958050e" containerName="sbdb" Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.134692 4832 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="0f189ec1-8169-4886-bff6-f67f7958050e" containerName="nbdb" Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.134699 4832 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f189ec1-8169-4886-bff6-f67f7958050e" containerName="ovnkube-controller" Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.134707 4832 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f189ec1-8169-4886-bff6-f67f7958050e" containerName="ovnkube-controller" Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.134714 4832 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f189ec1-8169-4886-bff6-f67f7958050e" containerName="ovn-controller" Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.134722 4832 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f189ec1-8169-4886-bff6-f67f7958050e" containerName="ovnkube-controller" Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.134729 4832 memory_manager.go:354] "RemoveStaleState removing state" podUID="9656d881-eeb6-44aa-a89a-dfffd1ba51fd" containerName="registry-server" Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.134737 4832 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f189ec1-8169-4886-bff6-f67f7958050e" containerName="kube-rbac-proxy-ovn-metrics" Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.134744 4832 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f189ec1-8169-4886-bff6-f67f7958050e" containerName="ovnkube-controller" Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.134772 4832 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f189ec1-8169-4886-bff6-f67f7958050e" containerName="kube-rbac-proxy-node" Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.134780 4832 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f189ec1-8169-4886-bff6-f67f7958050e" containerName="northd" Oct 03 00:21:50 crc kubenswrapper[4832]: E1003 00:21:50.134862 4832 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f189ec1-8169-4886-bff6-f67f7958050e" containerName="ovnkube-controller" Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.134868 4832 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f189ec1-8169-4886-bff6-f67f7958050e" containerName="ovnkube-controller" Oct 03 00:21:50 crc kubenswrapper[4832]: E1003 00:21:50.134879 4832 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f189ec1-8169-4886-bff6-f67f7958050e" containerName="ovnkube-controller" Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.134885 4832 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f189ec1-8169-4886-bff6-f67f7958050e" containerName="ovnkube-controller" Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.135001 4832 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f189ec1-8169-4886-bff6-f67f7958050e" containerName="ovnkube-controller" Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.136914 4832 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-ctmv8" Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.148091 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0f189ec1-8169-4886-bff6-f67f7958050e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"0f189ec1-8169-4886-bff6-f67f7958050e\" (UID: \"0f189ec1-8169-4886-bff6-f67f7958050e\") " Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.148144 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/0f189ec1-8169-4886-bff6-f67f7958050e-log-socket\") pod \"0f189ec1-8169-4886-bff6-f67f7958050e\" (UID: \"0f189ec1-8169-4886-bff6-f67f7958050e\") " Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.148180 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0f189ec1-8169-4886-bff6-f67f7958050e-etc-openvswitch\") pod \"0f189ec1-8169-4886-bff6-f67f7958050e\" (UID: \"0f189ec1-8169-4886-bff6-f67f7958050e\") " Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.148201 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0f189ec1-8169-4886-bff6-f67f7958050e-host-slash\") pod \"0f189ec1-8169-4886-bff6-f67f7958050e\" (UID: \"0f189ec1-8169-4886-bff6-f67f7958050e\") " Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.148224 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0f189ec1-8169-4886-bff6-f67f7958050e-var-lib-openvswitch\") pod \"0f189ec1-8169-4886-bff6-f67f7958050e\" (UID: \"0f189ec1-8169-4886-bff6-f67f7958050e\") " Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.148254 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0f189ec1-8169-4886-bff6-f67f7958050e-ovnkube-config\") pod \"0f189ec1-8169-4886-bff6-f67f7958050e\" (UID: \"0f189ec1-8169-4886-bff6-f67f7958050e\") " Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.148289 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/0f189ec1-8169-4886-bff6-f67f7958050e-run-systemd\") pod \"0f189ec1-8169-4886-bff6-f67f7958050e\" (UID: \"0f189ec1-8169-4886-bff6-f67f7958050e\") " Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.148318 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/0f189ec1-8169-4886-bff6-f67f7958050e-host-kubelet\") pod \"0f189ec1-8169-4886-bff6-f67f7958050e\" (UID: \"0f189ec1-8169-4886-bff6-f67f7958050e\") " Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.148349 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nlv87\" (UniqueName: \"kubernetes.io/projected/0f189ec1-8169-4886-bff6-f67f7958050e-kube-api-access-nlv87\") pod \"0f189ec1-8169-4886-bff6-f67f7958050e\" (UID: \"0f189ec1-8169-4886-bff6-f67f7958050e\") " Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.148374 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/0f189ec1-8169-4886-bff6-f67f7958050e-run-openvswitch\") pod \"0f189ec1-8169-4886-bff6-f67f7958050e\" (UID: \"0f189ec1-8169-4886-bff6-f67f7958050e\") " Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.148404 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0f189ec1-8169-4886-bff6-f67f7958050e-env-overrides\") pod \"0f189ec1-8169-4886-bff6-f67f7958050e\" (UID: \"0f189ec1-8169-4886-bff6-f67f7958050e\") " Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.148441 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0f189ec1-8169-4886-bff6-f67f7958050e-host-run-netns\") pod \"0f189ec1-8169-4886-bff6-f67f7958050e\" (UID: \"0f189ec1-8169-4886-bff6-f67f7958050e\") " Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.148472 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/0f189ec1-8169-4886-bff6-f67f7958050e-ovnkube-script-lib\") pod \"0f189ec1-8169-4886-bff6-f67f7958050e\" (UID: \"0f189ec1-8169-4886-bff6-f67f7958050e\") " Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.148504 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/0f189ec1-8169-4886-bff6-f67f7958050e-host-cni-netd\") pod \"0f189ec1-8169-4886-bff6-f67f7958050e\" (UID: \"0f189ec1-8169-4886-bff6-f67f7958050e\") " Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.148535 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0f189ec1-8169-4886-bff6-f67f7958050e-ovn-node-metrics-cert\") pod \"0f189ec1-8169-4886-bff6-f67f7958050e\" (UID: \"0f189ec1-8169-4886-bff6-f67f7958050e\") " Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.148559 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/0f189ec1-8169-4886-bff6-f67f7958050e-systemd-units\") pod \"0f189ec1-8169-4886-bff6-f67f7958050e\" (UID: \"0f189ec1-8169-4886-bff6-f67f7958050e\") " Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.148631 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/0f189ec1-8169-4886-bff6-f67f7958050e-run-ovn\") pod \"0f189ec1-8169-4886-bff6-f67f7958050e\" (UID: \"0f189ec1-8169-4886-bff6-f67f7958050e\") " Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.148655 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0f189ec1-8169-4886-bff6-f67f7958050e-host-run-ovn-kubernetes\") pod \"0f189ec1-8169-4886-bff6-f67f7958050e\" (UID: \"0f189ec1-8169-4886-bff6-f67f7958050e\") " Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.148678 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0f189ec1-8169-4886-bff6-f67f7958050e-host-cni-bin\") pod \"0f189ec1-8169-4886-bff6-f67f7958050e\" (UID: \"0f189ec1-8169-4886-bff6-f67f7958050e\") " Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.148695 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: 
\"kubernetes.io/host-path/0f189ec1-8169-4886-bff6-f67f7958050e-node-log\") pod \"0f189ec1-8169-4886-bff6-f67f7958050e\" (UID: \"0f189ec1-8169-4886-bff6-f67f7958050e\") " Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.148967 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0f189ec1-8169-4886-bff6-f67f7958050e-node-log" (OuterVolumeSpecName: "node-log") pod "0f189ec1-8169-4886-bff6-f67f7958050e" (UID: "0f189ec1-8169-4886-bff6-f67f7958050e"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.149009 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0f189ec1-8169-4886-bff6-f67f7958050e-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "0f189ec1-8169-4886-bff6-f67f7958050e" (UID: "0f189ec1-8169-4886-bff6-f67f7958050e"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.149037 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0f189ec1-8169-4886-bff6-f67f7958050e-log-socket" (OuterVolumeSpecName: "log-socket") pod "0f189ec1-8169-4886-bff6-f67f7958050e" (UID: "0f189ec1-8169-4886-bff6-f67f7958050e"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.149060 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0f189ec1-8169-4886-bff6-f67f7958050e-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "0f189ec1-8169-4886-bff6-f67f7958050e" (UID: "0f189ec1-8169-4886-bff6-f67f7958050e"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.149083 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0f189ec1-8169-4886-bff6-f67f7958050e-host-slash" (OuterVolumeSpecName: "host-slash") pod "0f189ec1-8169-4886-bff6-f67f7958050e" (UID: "0f189ec1-8169-4886-bff6-f67f7958050e"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.149105 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0f189ec1-8169-4886-bff6-f67f7958050e-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "0f189ec1-8169-4886-bff6-f67f7958050e" (UID: "0f189ec1-8169-4886-bff6-f67f7958050e"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.149541 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f189ec1-8169-4886-bff6-f67f7958050e-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "0f189ec1-8169-4886-bff6-f67f7958050e" (UID: "0f189ec1-8169-4886-bff6-f67f7958050e"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.150227 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0f189ec1-8169-4886-bff6-f67f7958050e-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "0f189ec1-8169-4886-bff6-f67f7958050e" (UID: "0f189ec1-8169-4886-bff6-f67f7958050e"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.152460 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0f189ec1-8169-4886-bff6-f67f7958050e-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "0f189ec1-8169-4886-bff6-f67f7958050e" (UID: "0f189ec1-8169-4886-bff6-f67f7958050e"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.152485 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0f189ec1-8169-4886-bff6-f67f7958050e-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "0f189ec1-8169-4886-bff6-f67f7958050e" (UID: "0f189ec1-8169-4886-bff6-f67f7958050e"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.152793 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f189ec1-8169-4886-bff6-f67f7958050e-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "0f189ec1-8169-4886-bff6-f67f7958050e" (UID: "0f189ec1-8169-4886-bff6-f67f7958050e"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.152811 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0f189ec1-8169-4886-bff6-f67f7958050e-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "0f189ec1-8169-4886-bff6-f67f7958050e" (UID: "0f189ec1-8169-4886-bff6-f67f7958050e"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.153161 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f189ec1-8169-4886-bff6-f67f7958050e-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "0f189ec1-8169-4886-bff6-f67f7958050e" (UID: "0f189ec1-8169-4886-bff6-f67f7958050e"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.153185 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0f189ec1-8169-4886-bff6-f67f7958050e-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "0f189ec1-8169-4886-bff6-f67f7958050e" (UID: "0f189ec1-8169-4886-bff6-f67f7958050e"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.153596 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0f189ec1-8169-4886-bff6-f67f7958050e-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "0f189ec1-8169-4886-bff6-f67f7958050e" (UID: "0f189ec1-8169-4886-bff6-f67f7958050e"). InnerVolumeSpecName "host-run-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.153613 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0f189ec1-8169-4886-bff6-f67f7958050e-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "0f189ec1-8169-4886-bff6-f67f7958050e" (UID: "0f189ec1-8169-4886-bff6-f67f7958050e"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.153634 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0f189ec1-8169-4886-bff6-f67f7958050e-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "0f189ec1-8169-4886-bff6-f67f7958050e" (UID: "0f189ec1-8169-4886-bff6-f67f7958050e"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.155571 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f189ec1-8169-4886-bff6-f67f7958050e-kube-api-access-nlv87" (OuterVolumeSpecName: "kube-api-access-nlv87") pod "0f189ec1-8169-4886-bff6-f67f7958050e" (UID: "0f189ec1-8169-4886-bff6-f67f7958050e"). InnerVolumeSpecName "kube-api-access-nlv87". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.156118 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f189ec1-8169-4886-bff6-f67f7958050e-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "0f189ec1-8169-4886-bff6-f67f7958050e" (UID: "0f189ec1-8169-4886-bff6-f67f7958050e"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.163362 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0f189ec1-8169-4886-bff6-f67f7958050e-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "0f189ec1-8169-4886-bff6-f67f7958050e" (UID: "0f189ec1-8169-4886-bff6-f67f7958050e"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.249856 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/283fb468-33d8-48c8-8b41-c9c9df346eae-node-log\") pod \"ovnkube-node-ctmv8\" (UID: \"283fb468-33d8-48c8-8b41-c9c9df346eae\") " pod="openshift-ovn-kubernetes/ovnkube-node-ctmv8" Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.249916 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/283fb468-33d8-48c8-8b41-c9c9df346eae-host-slash\") pod \"ovnkube-node-ctmv8\" (UID: \"283fb468-33d8-48c8-8b41-c9c9df346eae\") " pod="openshift-ovn-kubernetes/ovnkube-node-ctmv8" Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.249951 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/283fb468-33d8-48c8-8b41-c9c9df346eae-etc-openvswitch\") pod \"ovnkube-node-ctmv8\" (UID: \"283fb468-33d8-48c8-8b41-c9c9df346eae\") " pod="openshift-ovn-kubernetes/ovnkube-node-ctmv8" Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.249981 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/283fb468-33d8-48c8-8b41-c9c9df346eae-var-lib-openvswitch\") pod \"ovnkube-node-ctmv8\" (UID: \"283fb468-33d8-48c8-8b41-c9c9df346eae\") " pod="openshift-ovn-kubernetes/ovnkube-node-ctmv8" Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.250079 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-blgcd\" (UniqueName: \"kubernetes.io/projected/283fb468-33d8-48c8-8b41-c9c9df346eae-kube-api-access-blgcd\") pod \"ovnkube-node-ctmv8\" (UID: \"283fb468-33d8-48c8-8b41-c9c9df346eae\") " pod="openshift-ovn-kubernetes/ovnkube-node-ctmv8" Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.250153 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/283fb468-33d8-48c8-8b41-c9c9df346eae-ovnkube-config\") pod \"ovnkube-node-ctmv8\" (UID: \"283fb468-33d8-48c8-8b41-c9c9df346eae\") " pod="openshift-ovn-kubernetes/ovnkube-node-ctmv8" Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.250188 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/283fb468-33d8-48c8-8b41-c9c9df346eae-host-run-netns\") pod \"ovnkube-node-ctmv8\" (UID: \"283fb468-33d8-48c8-8b41-c9c9df346eae\") " pod="openshift-ovn-kubernetes/ovnkube-node-ctmv8" Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.250221 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/283fb468-33d8-48c8-8b41-c9c9df346eae-ovnkube-script-lib\") pod \"ovnkube-node-ctmv8\" (UID: \"283fb468-33d8-48c8-8b41-c9c9df346eae\") " pod="openshift-ovn-kubernetes/ovnkube-node-ctmv8" Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.250241 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/283fb468-33d8-48c8-8b41-c9c9df346eae-run-systemd\") pod \"ovnkube-node-ctmv8\" (UID: \"283fb468-33d8-48c8-8b41-c9c9df346eae\") " pod="openshift-ovn-kubernetes/ovnkube-node-ctmv8" Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.250260 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/283fb468-33d8-48c8-8b41-c9c9df346eae-host-run-ovn-kubernetes\") pod \"ovnkube-node-ctmv8\" (UID: \"283fb468-33d8-48c8-8b41-c9c9df346eae\") " pod="openshift-ovn-kubernetes/ovnkube-node-ctmv8" Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.250439 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/283fb468-33d8-48c8-8b41-c9c9df346eae-host-kubelet\") pod \"ovnkube-node-ctmv8\" (UID: \"283fb468-33d8-48c8-8b41-c9c9df346eae\") " pod="openshift-ovn-kubernetes/ovnkube-node-ctmv8" Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.250482 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/283fb468-33d8-48c8-8b41-c9c9df346eae-host-cni-netd\") pod \"ovnkube-node-ctmv8\" (UID: \"283fb468-33d8-48c8-8b41-c9c9df346eae\") " pod="openshift-ovn-kubernetes/ovnkube-node-ctmv8" Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.250517 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/283fb468-33d8-48c8-8b41-c9c9df346eae-run-ovn\") pod \"ovnkube-node-ctmv8\" (UID: \"283fb468-33d8-48c8-8b41-c9c9df346eae\") " pod="openshift-ovn-kubernetes/ovnkube-node-ctmv8" Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.250556 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/283fb468-33d8-48c8-8b41-c9c9df346eae-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-ctmv8\" (UID: \"283fb468-33d8-48c8-8b41-c9c9df346eae\") " pod="openshift-ovn-kubernetes/ovnkube-node-ctmv8" Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.250630 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/283fb468-33d8-48c8-8b41-c9c9df346eae-systemd-units\") pod \"ovnkube-node-ctmv8\" (UID: \"283fb468-33d8-48c8-8b41-c9c9df346eae\") " pod="openshift-ovn-kubernetes/ovnkube-node-ctmv8" Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.250688 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/283fb468-33d8-48c8-8b41-c9c9df346eae-host-cni-bin\") pod \"ovnkube-node-ctmv8\" (UID: \"283fb468-33d8-48c8-8b41-c9c9df346eae\") " pod="openshift-ovn-kubernetes/ovnkube-node-ctmv8" Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.250714 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/283fb468-33d8-48c8-8b41-c9c9df346eae-env-overrides\") pod \"ovnkube-node-ctmv8\" (UID: \"283fb468-33d8-48c8-8b41-c9c9df346eae\") " pod="openshift-ovn-kubernetes/ovnkube-node-ctmv8" Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.250733 4832 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/283fb468-33d8-48c8-8b41-c9c9df346eae-run-openvswitch\") pod \"ovnkube-node-ctmv8\" (UID: \"283fb468-33d8-48c8-8b41-c9c9df346eae\") " pod="openshift-ovn-kubernetes/ovnkube-node-ctmv8" Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.250748 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/283fb468-33d8-48c8-8b41-c9c9df346eae-log-socket\") pod \"ovnkube-node-ctmv8\" (UID: \"283fb468-33d8-48c8-8b41-c9c9df346eae\") " pod="openshift-ovn-kubernetes/ovnkube-node-ctmv8" Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.250790 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/283fb468-33d8-48c8-8b41-c9c9df346eae-ovn-node-metrics-cert\") pod \"ovnkube-node-ctmv8\" (UID: \"283fb468-33d8-48c8-8b41-c9c9df346eae\") " pod="openshift-ovn-kubernetes/ovnkube-node-ctmv8" Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.250870 4832 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/0f189ec1-8169-4886-bff6-f67f7958050e-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.250886 4832 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0f189ec1-8169-4886-bff6-f67f7958050e-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.250896 4832 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0f189ec1-8169-4886-bff6-f67f7958050e-host-cni-bin\") on node \"crc\" DevicePath \"\"" Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.250905 4832 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/0f189ec1-8169-4886-bff6-f67f7958050e-node-log\") on node \"crc\" DevicePath \"\"" Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.250916 4832 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0f189ec1-8169-4886-bff6-f67f7958050e-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.250926 4832 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/0f189ec1-8169-4886-bff6-f67f7958050e-log-socket\") on node \"crc\" DevicePath \"\"" Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.250936 4832 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0f189ec1-8169-4886-bff6-f67f7958050e-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.250945 4832 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0f189ec1-8169-4886-bff6-f67f7958050e-host-slash\") on node \"crc\" DevicePath \"\"" Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.250954 4832 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/0f189ec1-8169-4886-bff6-f67f7958050e-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.250965 4832 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0f189ec1-8169-4886-bff6-f67f7958050e-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.250974 4832 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/0f189ec1-8169-4886-bff6-f67f7958050e-run-systemd\") on node \"crc\" DevicePath \"\"" Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.250984 4832 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/0f189ec1-8169-4886-bff6-f67f7958050e-host-kubelet\") on node \"crc\" DevicePath \"\"" Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.250993 4832 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nlv87\" (UniqueName: \"kubernetes.io/projected/0f189ec1-8169-4886-bff6-f67f7958050e-kube-api-access-nlv87\") on node \"crc\" DevicePath \"\"" Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.251004 4832 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0f189ec1-8169-4886-bff6-f67f7958050e-run-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.251013 4832 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0f189ec1-8169-4886-bff6-f67f7958050e-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.251022 4832 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0f189ec1-8169-4886-bff6-f67f7958050e-host-run-netns\") on node \"crc\" DevicePath \"\"" Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.251032 4832 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/0f189ec1-8169-4886-bff6-f67f7958050e-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.251040 4832 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/0f189ec1-8169-4886-bff6-f67f7958050e-host-cni-netd\") on node \"crc\" DevicePath \"\"" Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.251048 4832 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0f189ec1-8169-4886-bff6-f67f7958050e-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.251056 4832 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/0f189ec1-8169-4886-bff6-f67f7958050e-systemd-units\") on node \"crc\" DevicePath \"\"" Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.352146 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/283fb468-33d8-48c8-8b41-c9c9df346eae-etc-openvswitch\") pod \"ovnkube-node-ctmv8\" (UID: \"283fb468-33d8-48c8-8b41-c9c9df346eae\") " pod="openshift-ovn-kubernetes/ovnkube-node-ctmv8" Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.352546 4832 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/283fb468-33d8-48c8-8b41-c9c9df346eae-var-lib-openvswitch\") pod \"ovnkube-node-ctmv8\" (UID: \"283fb468-33d8-48c8-8b41-c9c9df346eae\") " pod="openshift-ovn-kubernetes/ovnkube-node-ctmv8" Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.352635 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-blgcd\" (UniqueName: \"kubernetes.io/projected/283fb468-33d8-48c8-8b41-c9c9df346eae-kube-api-access-blgcd\") pod \"ovnkube-node-ctmv8\" (UID: \"283fb468-33d8-48c8-8b41-c9c9df346eae\") " pod="openshift-ovn-kubernetes/ovnkube-node-ctmv8" Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.352720 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/283fb468-33d8-48c8-8b41-c9c9df346eae-ovnkube-config\") pod \"ovnkube-node-ctmv8\" (UID: \"283fb468-33d8-48c8-8b41-c9c9df346eae\") " pod="openshift-ovn-kubernetes/ovnkube-node-ctmv8" Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.352824 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/283fb468-33d8-48c8-8b41-c9c9df346eae-host-run-netns\") pod \"ovnkube-node-ctmv8\" (UID: \"283fb468-33d8-48c8-8b41-c9c9df346eae\") " pod="openshift-ovn-kubernetes/ovnkube-node-ctmv8" Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.352306 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/283fb468-33d8-48c8-8b41-c9c9df346eae-etc-openvswitch\") pod \"ovnkube-node-ctmv8\" (UID: \"283fb468-33d8-48c8-8b41-c9c9df346eae\") " pod="openshift-ovn-kubernetes/ovnkube-node-ctmv8" Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.352574 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/283fb468-33d8-48c8-8b41-c9c9df346eae-var-lib-openvswitch\") pod \"ovnkube-node-ctmv8\" (UID: \"283fb468-33d8-48c8-8b41-c9c9df346eae\") " pod="openshift-ovn-kubernetes/ovnkube-node-ctmv8" Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.352989 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/283fb468-33d8-48c8-8b41-c9c9df346eae-host-run-netns\") pod \"ovnkube-node-ctmv8\" (UID: \"283fb468-33d8-48c8-8b41-c9c9df346eae\") " pod="openshift-ovn-kubernetes/ovnkube-node-ctmv8" Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.352931 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/283fb468-33d8-48c8-8b41-c9c9df346eae-ovnkube-script-lib\") pod \"ovnkube-node-ctmv8\" (UID: \"283fb468-33d8-48c8-8b41-c9c9df346eae\") " pod="openshift-ovn-kubernetes/ovnkube-node-ctmv8" Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.353068 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/283fb468-33d8-48c8-8b41-c9c9df346eae-run-systemd\") pod \"ovnkube-node-ctmv8\" (UID: \"283fb468-33d8-48c8-8b41-c9c9df346eae\") " pod="openshift-ovn-kubernetes/ovnkube-node-ctmv8" Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.353101 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/283fb468-33d8-48c8-8b41-c9c9df346eae-host-run-ovn-kubernetes\") pod \"ovnkube-node-ctmv8\" (UID: \"283fb468-33d8-48c8-8b41-c9c9df346eae\") " pod="openshift-ovn-kubernetes/ovnkube-node-ctmv8" Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.353132 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/283fb468-33d8-48c8-8b41-c9c9df346eae-host-kubelet\") pod \"ovnkube-node-ctmv8\" (UID: \"283fb468-33d8-48c8-8b41-c9c9df346eae\") " pod="openshift-ovn-kubernetes/ovnkube-node-ctmv8" Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.353147 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/283fb468-33d8-48c8-8b41-c9c9df346eae-host-cni-netd\") pod \"ovnkube-node-ctmv8\" (UID: \"283fb468-33d8-48c8-8b41-c9c9df346eae\") " pod="openshift-ovn-kubernetes/ovnkube-node-ctmv8" Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.353168 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/283fb468-33d8-48c8-8b41-c9c9df346eae-run-ovn\") pod \"ovnkube-node-ctmv8\" (UID: \"283fb468-33d8-48c8-8b41-c9c9df346eae\") " pod="openshift-ovn-kubernetes/ovnkube-node-ctmv8" Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.353168 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/283fb468-33d8-48c8-8b41-c9c9df346eae-run-systemd\") pod \"ovnkube-node-ctmv8\" (UID: \"283fb468-33d8-48c8-8b41-c9c9df346eae\") " pod="openshift-ovn-kubernetes/ovnkube-node-ctmv8" Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.353184 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/283fb468-33d8-48c8-8b41-c9c9df346eae-host-run-ovn-kubernetes\") pod \"ovnkube-node-ctmv8\" (UID: \"283fb468-33d8-48c8-8b41-c9c9df346eae\") " pod="openshift-ovn-kubernetes/ovnkube-node-ctmv8" Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.353196 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/283fb468-33d8-48c8-8b41-c9c9df346eae-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-ctmv8\" (UID: \"283fb468-33d8-48c8-8b41-c9c9df346eae\") " pod="openshift-ovn-kubernetes/ovnkube-node-ctmv8" Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.353219 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/283fb468-33d8-48c8-8b41-c9c9df346eae-run-ovn\") pod \"ovnkube-node-ctmv8\" (UID: \"283fb468-33d8-48c8-8b41-c9c9df346eae\") " pod="openshift-ovn-kubernetes/ovnkube-node-ctmv8" Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.353222 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/283fb468-33d8-48c8-8b41-c9c9df346eae-host-cni-netd\") pod \"ovnkube-node-ctmv8\" (UID: \"283fb468-33d8-48c8-8b41-c9c9df346eae\") " pod="openshift-ovn-kubernetes/ovnkube-node-ctmv8" Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.353254 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/283fb468-33d8-48c8-8b41-c9c9df346eae-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-ctmv8\" (UID: \"283fb468-33d8-48c8-8b41-c9c9df346eae\") " pod="openshift-ovn-kubernetes/ovnkube-node-ctmv8" Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.353284 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/283fb468-33d8-48c8-8b41-c9c9df346eae-systemd-units\") pod \"ovnkube-node-ctmv8\" (UID: \"283fb468-33d8-48c8-8b41-c9c9df346eae\") " pod="openshift-ovn-kubernetes/ovnkube-node-ctmv8" Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.353261 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/283fb468-33d8-48c8-8b41-c9c9df346eae-systemd-units\") pod \"ovnkube-node-ctmv8\" (UID: \"283fb468-33d8-48c8-8b41-c9c9df346eae\") " pod="openshift-ovn-kubernetes/ovnkube-node-ctmv8" Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.353320 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/283fb468-33d8-48c8-8b41-c9c9df346eae-host-cni-bin\") pod \"ovnkube-node-ctmv8\" (UID: \"283fb468-33d8-48c8-8b41-c9c9df346eae\") " pod="openshift-ovn-kubernetes/ovnkube-node-ctmv8" Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.353303 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/283fb468-33d8-48c8-8b41-c9c9df346eae-host-kubelet\") pod \"ovnkube-node-ctmv8\" (UID: \"283fb468-33d8-48c8-8b41-c9c9df346eae\") " pod="openshift-ovn-kubernetes/ovnkube-node-ctmv8" Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.353361 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/283fb468-33d8-48c8-8b41-c9c9df346eae-host-cni-bin\") pod \"ovnkube-node-ctmv8\" (UID: \"283fb468-33d8-48c8-8b41-c9c9df346eae\") " pod="openshift-ovn-kubernetes/ovnkube-node-ctmv8" Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.353339 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/283fb468-33d8-48c8-8b41-c9c9df346eae-env-overrides\") pod \"ovnkube-node-ctmv8\" (UID: \"283fb468-33d8-48c8-8b41-c9c9df346eae\") " pod="openshift-ovn-kubernetes/ovnkube-node-ctmv8" Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.353438 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/283fb468-33d8-48c8-8b41-c9c9df346eae-run-openvswitch\") pod \"ovnkube-node-ctmv8\" (UID: \"283fb468-33d8-48c8-8b41-c9c9df346eae\") " pod="openshift-ovn-kubernetes/ovnkube-node-ctmv8" Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.353469 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/283fb468-33d8-48c8-8b41-c9c9df346eae-log-socket\") pod \"ovnkube-node-ctmv8\" (UID: \"283fb468-33d8-48c8-8b41-c9c9df346eae\") " pod="openshift-ovn-kubernetes/ovnkube-node-ctmv8" Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.353492 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/283fb468-33d8-48c8-8b41-c9c9df346eae-ovn-node-metrics-cert\") pod \"ovnkube-node-ctmv8\" 
(UID: \"283fb468-33d8-48c8-8b41-c9c9df346eae\") " pod="openshift-ovn-kubernetes/ovnkube-node-ctmv8" Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.353517 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/283fb468-33d8-48c8-8b41-c9c9df346eae-run-openvswitch\") pod \"ovnkube-node-ctmv8\" (UID: \"283fb468-33d8-48c8-8b41-c9c9df346eae\") " pod="openshift-ovn-kubernetes/ovnkube-node-ctmv8" Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.353531 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/283fb468-33d8-48c8-8b41-c9c9df346eae-log-socket\") pod \"ovnkube-node-ctmv8\" (UID: \"283fb468-33d8-48c8-8b41-c9c9df346eae\") " pod="openshift-ovn-kubernetes/ovnkube-node-ctmv8" Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.353541 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/283fb468-33d8-48c8-8b41-c9c9df346eae-node-log\") pod \"ovnkube-node-ctmv8\" (UID: \"283fb468-33d8-48c8-8b41-c9c9df346eae\") " pod="openshift-ovn-kubernetes/ovnkube-node-ctmv8" Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.353560 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/283fb468-33d8-48c8-8b41-c9c9df346eae-node-log\") pod \"ovnkube-node-ctmv8\" (UID: \"283fb468-33d8-48c8-8b41-c9c9df346eae\") " pod="openshift-ovn-kubernetes/ovnkube-node-ctmv8" Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.353568 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/283fb468-33d8-48c8-8b41-c9c9df346eae-host-slash\") pod \"ovnkube-node-ctmv8\" (UID: \"283fb468-33d8-48c8-8b41-c9c9df346eae\") " pod="openshift-ovn-kubernetes/ovnkube-node-ctmv8" Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.353649 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/283fb468-33d8-48c8-8b41-c9c9df346eae-host-slash\") pod \"ovnkube-node-ctmv8\" (UID: \"283fb468-33d8-48c8-8b41-c9c9df346eae\") " pod="openshift-ovn-kubernetes/ovnkube-node-ctmv8" Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.353955 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/283fb468-33d8-48c8-8b41-c9c9df346eae-env-overrides\") pod \"ovnkube-node-ctmv8\" (UID: \"283fb468-33d8-48c8-8b41-c9c9df346eae\") " pod="openshift-ovn-kubernetes/ovnkube-node-ctmv8" Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.354080 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/283fb468-33d8-48c8-8b41-c9c9df346eae-ovnkube-config\") pod \"ovnkube-node-ctmv8\" (UID: \"283fb468-33d8-48c8-8b41-c9c9df346eae\") " pod="openshift-ovn-kubernetes/ovnkube-node-ctmv8" Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.354610 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/283fb468-33d8-48c8-8b41-c9c9df346eae-ovnkube-script-lib\") pod \"ovnkube-node-ctmv8\" (UID: \"283fb468-33d8-48c8-8b41-c9c9df346eae\") " pod="openshift-ovn-kubernetes/ovnkube-node-ctmv8" Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.356868 4832 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/283fb468-33d8-48c8-8b41-c9c9df346eae-ovn-node-metrics-cert\") pod \"ovnkube-node-ctmv8\" (UID: \"283fb468-33d8-48c8-8b41-c9c9df346eae\") " pod="openshift-ovn-kubernetes/ovnkube-node-ctmv8" Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.368728 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-blgcd\" (UniqueName: \"kubernetes.io/projected/283fb468-33d8-48c8-8b41-c9c9df346eae-kube-api-access-blgcd\") pod \"ovnkube-node-ctmv8\" (UID: \"283fb468-33d8-48c8-8b41-c9c9df346eae\") " pod="openshift-ovn-kubernetes/ovnkube-node-ctmv8" Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.475276 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-ctmv8" Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.779682 4832 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-rwqt7_96bf1ee1-9249-4db9-83da-17eb15aa5f9a/kube-multus/2.log" Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.780226 4832 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-rwqt7_96bf1ee1-9249-4db9-83da-17eb15aa5f9a/kube-multus/1.log" Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.780269 4832 generic.go:334] "Generic (PLEG): container finished" podID="96bf1ee1-9249-4db9-83da-17eb15aa5f9a" containerID="f7b3d48173bbf797245933ace78f31ccd4545faddc97703430950e2068752402" exitCode=2 Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.780327 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-rwqt7" event={"ID":"96bf1ee1-9249-4db9-83da-17eb15aa5f9a","Type":"ContainerDied","Data":"f7b3d48173bbf797245933ace78f31ccd4545faddc97703430950e2068752402"} Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.780361 4832 scope.go:117] "RemoveContainer" containerID="d3584877a37d5f67dfaff9f091443ff884963e53e62bde6c8cefc55988d8c3a3" Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.780911 4832 scope.go:117] "RemoveContainer" containerID="f7b3d48173bbf797245933ace78f31ccd4545faddc97703430950e2068752402" Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.782149 4832 generic.go:334] "Generic (PLEG): container finished" podID="283fb468-33d8-48c8-8b41-c9c9df346eae" containerID="4e14fdcadf1c46bd495505a7721c03947b791ab75797a680e45bb0897dee44f3" exitCode=0 Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.782244 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ctmv8" event={"ID":"283fb468-33d8-48c8-8b41-c9c9df346eae","Type":"ContainerDied","Data":"4e14fdcadf1c46bd495505a7721c03947b791ab75797a680e45bb0897dee44f3"} Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.782316 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ctmv8" event={"ID":"283fb468-33d8-48c8-8b41-c9c9df346eae","Type":"ContainerStarted","Data":"7d6e689e7a1bf461acaa914053a9f853604712b0369112c9592c2f78954c75eb"} Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.785875 4832 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zc9xl_0f189ec1-8169-4886-bff6-f67f7958050e/ovnkube-controller/3.log" Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.791074 4832 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zc9xl_0f189ec1-8169-4886-bff6-f67f7958050e/ovn-acl-logging/0.log" Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.792484 4832 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zc9xl_0f189ec1-8169-4886-bff6-f67f7958050e/ovn-controller/0.log" Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.793135 4832 generic.go:334] "Generic (PLEG): container finished" podID="0f189ec1-8169-4886-bff6-f67f7958050e" containerID="331f9777df369634e691a52d3564282da8b3d170cca2aad394f2d3e7888228ae" exitCode=0 Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.793156 4832 generic.go:334] "Generic (PLEG): container finished" podID="0f189ec1-8169-4886-bff6-f67f7958050e" containerID="ae1e89f85d166170fd478f58e653b46eaa1c8335c1b124af9fde5b805864197b" exitCode=0 Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.793164 4832 generic.go:334] "Generic (PLEG): container finished" podID="0f189ec1-8169-4886-bff6-f67f7958050e" containerID="a3b296d00954bf341fee4ad258c6ee44adbd2950c89e73aaf5141201b61c5332" exitCode=0 Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.793174 4832 generic.go:334] "Generic (PLEG): container finished" podID="0f189ec1-8169-4886-bff6-f67f7958050e" containerID="7712a571bceb2c88adb90f9f18cdef6f7666e96341c1f9c87e6a55c82b59efd3" exitCode=0 Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.793182 4832 generic.go:334] "Generic (PLEG): container finished" podID="0f189ec1-8169-4886-bff6-f67f7958050e" containerID="f2c63d70335d75dec7e19ae160dfa2a00f834d9aa3725bb4b63b94aa488fca0f" exitCode=0 Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.793191 4832 generic.go:334] "Generic (PLEG): container finished" podID="0f189ec1-8169-4886-bff6-f67f7958050e" containerID="2ba40dbb3c6b603f4c75b3f017b0b8953a189aac6dba4cc0e1fa5174b6353f3f" exitCode=0 Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.793199 4832 generic.go:334] "Generic (PLEG): container finished" podID="0f189ec1-8169-4886-bff6-f67f7958050e" containerID="e6ee507d1127a1ca82e96f57a031cfef9adbd0b310588ce49a32f0cbcab7b3f4" exitCode=143 Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.793208 4832 generic.go:334] "Generic (PLEG): container finished" podID="0f189ec1-8169-4886-bff6-f67f7958050e" containerID="1e493b0aab45e7a7842b48f0a13a6850b3bcd8a8f2147880ff278b92b04e9535" exitCode=143 Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.793231 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" event={"ID":"0f189ec1-8169-4886-bff6-f67f7958050e","Type":"ContainerDied","Data":"331f9777df369634e691a52d3564282da8b3d170cca2aad394f2d3e7888228ae"} Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.793263 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" event={"ID":"0f189ec1-8169-4886-bff6-f67f7958050e","Type":"ContainerDied","Data":"ae1e89f85d166170fd478f58e653b46eaa1c8335c1b124af9fde5b805864197b"} Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.793278 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" event={"ID":"0f189ec1-8169-4886-bff6-f67f7958050e","Type":"ContainerDied","Data":"a3b296d00954bf341fee4ad258c6ee44adbd2950c89e73aaf5141201b61c5332"} Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.793292 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" 
event={"ID":"0f189ec1-8169-4886-bff6-f67f7958050e","Type":"ContainerDied","Data":"7712a571bceb2c88adb90f9f18cdef6f7666e96341c1f9c87e6a55c82b59efd3"} Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.793306 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" event={"ID":"0f189ec1-8169-4886-bff6-f67f7958050e","Type":"ContainerDied","Data":"f2c63d70335d75dec7e19ae160dfa2a00f834d9aa3725bb4b63b94aa488fca0f"} Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.793319 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" event={"ID":"0f189ec1-8169-4886-bff6-f67f7958050e","Type":"ContainerDied","Data":"2ba40dbb3c6b603f4c75b3f017b0b8953a189aac6dba4cc0e1fa5174b6353f3f"} Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.793333 4832 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"331f9777df369634e691a52d3564282da8b3d170cca2aad394f2d3e7888228ae"} Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.793346 4832 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e1fbfcaa81a9cf32a47adcb39be9ce4e2d5f3b9cd8b8e8556cea49fd099ccef3"} Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.793353 4832 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ae1e89f85d166170fd478f58e653b46eaa1c8335c1b124af9fde5b805864197b"} Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.793360 4832 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a3b296d00954bf341fee4ad258c6ee44adbd2950c89e73aaf5141201b61c5332"} Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.793368 4832 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7712a571bceb2c88adb90f9f18cdef6f7666e96341c1f9c87e6a55c82b59efd3"} Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.793375 4832 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f2c63d70335d75dec7e19ae160dfa2a00f834d9aa3725bb4b63b94aa488fca0f"} Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.793382 4832 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2ba40dbb3c6b603f4c75b3f017b0b8953a189aac6dba4cc0e1fa5174b6353f3f"} Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.793388 4832 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e6ee507d1127a1ca82e96f57a031cfef9adbd0b310588ce49a32f0cbcab7b3f4"} Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.793395 4832 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1e493b0aab45e7a7842b48f0a13a6850b3bcd8a8f2147880ff278b92b04e9535"} Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.793402 4832 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0669e6b46f57beece4c83940bcffe580a4b4e2db5451a9c9ca4a9ad486fc95b6"} Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.793411 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" 
event={"ID":"0f189ec1-8169-4886-bff6-f67f7958050e","Type":"ContainerDied","Data":"e6ee507d1127a1ca82e96f57a031cfef9adbd0b310588ce49a32f0cbcab7b3f4"} Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.793422 4832 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"331f9777df369634e691a52d3564282da8b3d170cca2aad394f2d3e7888228ae"} Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.793431 4832 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e1fbfcaa81a9cf32a47adcb39be9ce4e2d5f3b9cd8b8e8556cea49fd099ccef3"} Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.793438 4832 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ae1e89f85d166170fd478f58e653b46eaa1c8335c1b124af9fde5b805864197b"} Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.793444 4832 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a3b296d00954bf341fee4ad258c6ee44adbd2950c89e73aaf5141201b61c5332"} Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.793451 4832 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7712a571bceb2c88adb90f9f18cdef6f7666e96341c1f9c87e6a55c82b59efd3"} Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.793458 4832 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f2c63d70335d75dec7e19ae160dfa2a00f834d9aa3725bb4b63b94aa488fca0f"} Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.793465 4832 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2ba40dbb3c6b603f4c75b3f017b0b8953a189aac6dba4cc0e1fa5174b6353f3f"} Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.793472 4832 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e6ee507d1127a1ca82e96f57a031cfef9adbd0b310588ce49a32f0cbcab7b3f4"} Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.793479 4832 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1e493b0aab45e7a7842b48f0a13a6850b3bcd8a8f2147880ff278b92b04e9535"} Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.793485 4832 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0669e6b46f57beece4c83940bcffe580a4b4e2db5451a9c9ca4a9ad486fc95b6"} Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.793495 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" event={"ID":"0f189ec1-8169-4886-bff6-f67f7958050e","Type":"ContainerDied","Data":"1e493b0aab45e7a7842b48f0a13a6850b3bcd8a8f2147880ff278b92b04e9535"} Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.793506 4832 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"331f9777df369634e691a52d3564282da8b3d170cca2aad394f2d3e7888228ae"} Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.793527 4832 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e1fbfcaa81a9cf32a47adcb39be9ce4e2d5f3b9cd8b8e8556cea49fd099ccef3"} Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.793535 4832 pod_container_deletor.go:114] 
"Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ae1e89f85d166170fd478f58e653b46eaa1c8335c1b124af9fde5b805864197b"} Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.793542 4832 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a3b296d00954bf341fee4ad258c6ee44adbd2950c89e73aaf5141201b61c5332"} Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.793550 4832 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7712a571bceb2c88adb90f9f18cdef6f7666e96341c1f9c87e6a55c82b59efd3"} Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.793557 4832 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f2c63d70335d75dec7e19ae160dfa2a00f834d9aa3725bb4b63b94aa488fca0f"} Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.793564 4832 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2ba40dbb3c6b603f4c75b3f017b0b8953a189aac6dba4cc0e1fa5174b6353f3f"} Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.793570 4832 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e6ee507d1127a1ca82e96f57a031cfef9adbd0b310588ce49a32f0cbcab7b3f4"} Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.793577 4832 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1e493b0aab45e7a7842b48f0a13a6850b3bcd8a8f2147880ff278b92b04e9535"} Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.793583 4832 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0669e6b46f57beece4c83940bcffe580a4b4e2db5451a9c9ca4a9ad486fc95b6"} Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.793593 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" event={"ID":"0f189ec1-8169-4886-bff6-f67f7958050e","Type":"ContainerDied","Data":"893a5dd7a6050b0f460f109807438c3564225c0c9254db0c028cc87736b5b8d7"} Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.793605 4832 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"331f9777df369634e691a52d3564282da8b3d170cca2aad394f2d3e7888228ae"} Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.793612 4832 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e1fbfcaa81a9cf32a47adcb39be9ce4e2d5f3b9cd8b8e8556cea49fd099ccef3"} Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.793620 4832 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ae1e89f85d166170fd478f58e653b46eaa1c8335c1b124af9fde5b805864197b"} Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.793626 4832 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a3b296d00954bf341fee4ad258c6ee44adbd2950c89e73aaf5141201b61c5332"} Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.793632 4832 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7712a571bceb2c88adb90f9f18cdef6f7666e96341c1f9c87e6a55c82b59efd3"} Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.793640 4832 pod_container_deletor.go:114] 
"Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f2c63d70335d75dec7e19ae160dfa2a00f834d9aa3725bb4b63b94aa488fca0f"} Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.793646 4832 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2ba40dbb3c6b603f4c75b3f017b0b8953a189aac6dba4cc0e1fa5174b6353f3f"} Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.793652 4832 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e6ee507d1127a1ca82e96f57a031cfef9adbd0b310588ce49a32f0cbcab7b3f4"} Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.793659 4832 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1e493b0aab45e7a7842b48f0a13a6850b3bcd8a8f2147880ff278b92b04e9535"} Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.793666 4832 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0669e6b46f57beece4c83940bcffe580a4b4e2db5451a9c9ca4a9ad486fc95b6"} Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.796563 4832 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-zc9xl" Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.816907 4832 scope.go:117] "RemoveContainer" containerID="331f9777df369634e691a52d3564282da8b3d170cca2aad394f2d3e7888228ae" Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.856198 4832 scope.go:117] "RemoveContainer" containerID="e1fbfcaa81a9cf32a47adcb39be9ce4e2d5f3b9cd8b8e8556cea49fd099ccef3" Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.863746 4832 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-zc9xl"] Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.867388 4832 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-zc9xl"] Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.893333 4832 scope.go:117] "RemoveContainer" containerID="ae1e89f85d166170fd478f58e653b46eaa1c8335c1b124af9fde5b805864197b" Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.931428 4832 scope.go:117] "RemoveContainer" containerID="a3b296d00954bf341fee4ad258c6ee44adbd2950c89e73aaf5141201b61c5332" Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.945316 4832 scope.go:117] "RemoveContainer" containerID="7712a571bceb2c88adb90f9f18cdef6f7666e96341c1f9c87e6a55c82b59efd3" Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.960359 4832 scope.go:117] "RemoveContainer" containerID="f2c63d70335d75dec7e19ae160dfa2a00f834d9aa3725bb4b63b94aa488fca0f" Oct 03 00:21:50 crc kubenswrapper[4832]: I1003 00:21:50.985211 4832 scope.go:117] "RemoveContainer" containerID="2ba40dbb3c6b603f4c75b3f017b0b8953a189aac6dba4cc0e1fa5174b6353f3f" Oct 03 00:21:51 crc kubenswrapper[4832]: I1003 00:21:51.004566 4832 scope.go:117] "RemoveContainer" containerID="e6ee507d1127a1ca82e96f57a031cfef9adbd0b310588ce49a32f0cbcab7b3f4" Oct 03 00:21:51 crc kubenswrapper[4832]: I1003 00:21:51.016975 4832 scope.go:117] "RemoveContainer" containerID="1e493b0aab45e7a7842b48f0a13a6850b3bcd8a8f2147880ff278b92b04e9535" Oct 03 00:21:51 crc kubenswrapper[4832]: I1003 00:21:51.031066 4832 scope.go:117] "RemoveContainer" containerID="0669e6b46f57beece4c83940bcffe580a4b4e2db5451a9c9ca4a9ad486fc95b6" Oct 03 00:21:51 crc kubenswrapper[4832]: I1003 00:21:51.052763 4832 scope.go:117] 
"RemoveContainer" containerID="331f9777df369634e691a52d3564282da8b3d170cca2aad394f2d3e7888228ae" Oct 03 00:21:51 crc kubenswrapper[4832]: E1003 00:21:51.053252 4832 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"331f9777df369634e691a52d3564282da8b3d170cca2aad394f2d3e7888228ae\": container with ID starting with 331f9777df369634e691a52d3564282da8b3d170cca2aad394f2d3e7888228ae not found: ID does not exist" containerID="331f9777df369634e691a52d3564282da8b3d170cca2aad394f2d3e7888228ae" Oct 03 00:21:51 crc kubenswrapper[4832]: I1003 00:21:51.053278 4832 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"331f9777df369634e691a52d3564282da8b3d170cca2aad394f2d3e7888228ae"} err="failed to get container status \"331f9777df369634e691a52d3564282da8b3d170cca2aad394f2d3e7888228ae\": rpc error: code = NotFound desc = could not find container \"331f9777df369634e691a52d3564282da8b3d170cca2aad394f2d3e7888228ae\": container with ID starting with 331f9777df369634e691a52d3564282da8b3d170cca2aad394f2d3e7888228ae not found: ID does not exist" Oct 03 00:21:51 crc kubenswrapper[4832]: I1003 00:21:51.053297 4832 scope.go:117] "RemoveContainer" containerID="e1fbfcaa81a9cf32a47adcb39be9ce4e2d5f3b9cd8b8e8556cea49fd099ccef3" Oct 03 00:21:51 crc kubenswrapper[4832]: E1003 00:21:51.053534 4832 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e1fbfcaa81a9cf32a47adcb39be9ce4e2d5f3b9cd8b8e8556cea49fd099ccef3\": container with ID starting with e1fbfcaa81a9cf32a47adcb39be9ce4e2d5f3b9cd8b8e8556cea49fd099ccef3 not found: ID does not exist" containerID="e1fbfcaa81a9cf32a47adcb39be9ce4e2d5f3b9cd8b8e8556cea49fd099ccef3" Oct 03 00:21:51 crc kubenswrapper[4832]: I1003 00:21:51.053558 4832 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1fbfcaa81a9cf32a47adcb39be9ce4e2d5f3b9cd8b8e8556cea49fd099ccef3"} err="failed to get container status \"e1fbfcaa81a9cf32a47adcb39be9ce4e2d5f3b9cd8b8e8556cea49fd099ccef3\": rpc error: code = NotFound desc = could not find container \"e1fbfcaa81a9cf32a47adcb39be9ce4e2d5f3b9cd8b8e8556cea49fd099ccef3\": container with ID starting with e1fbfcaa81a9cf32a47adcb39be9ce4e2d5f3b9cd8b8e8556cea49fd099ccef3 not found: ID does not exist" Oct 03 00:21:51 crc kubenswrapper[4832]: I1003 00:21:51.053576 4832 scope.go:117] "RemoveContainer" containerID="ae1e89f85d166170fd478f58e653b46eaa1c8335c1b124af9fde5b805864197b" Oct 03 00:21:51 crc kubenswrapper[4832]: E1003 00:21:51.055307 4832 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae1e89f85d166170fd478f58e653b46eaa1c8335c1b124af9fde5b805864197b\": container with ID starting with ae1e89f85d166170fd478f58e653b46eaa1c8335c1b124af9fde5b805864197b not found: ID does not exist" containerID="ae1e89f85d166170fd478f58e653b46eaa1c8335c1b124af9fde5b805864197b" Oct 03 00:21:51 crc kubenswrapper[4832]: I1003 00:21:51.055331 4832 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae1e89f85d166170fd478f58e653b46eaa1c8335c1b124af9fde5b805864197b"} err="failed to get container status \"ae1e89f85d166170fd478f58e653b46eaa1c8335c1b124af9fde5b805864197b\": rpc error: code = NotFound desc = could not find container \"ae1e89f85d166170fd478f58e653b46eaa1c8335c1b124af9fde5b805864197b\": container with ID starting with 
ae1e89f85d166170fd478f58e653b46eaa1c8335c1b124af9fde5b805864197b not found: ID does not exist" Oct 03 00:21:51 crc kubenswrapper[4832]: I1003 00:21:51.055346 4832 scope.go:117] "RemoveContainer" containerID="a3b296d00954bf341fee4ad258c6ee44adbd2950c89e73aaf5141201b61c5332" Oct 03 00:21:51 crc kubenswrapper[4832]: E1003 00:21:51.055634 4832 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a3b296d00954bf341fee4ad258c6ee44adbd2950c89e73aaf5141201b61c5332\": container with ID starting with a3b296d00954bf341fee4ad258c6ee44adbd2950c89e73aaf5141201b61c5332 not found: ID does not exist" containerID="a3b296d00954bf341fee4ad258c6ee44adbd2950c89e73aaf5141201b61c5332" Oct 03 00:21:51 crc kubenswrapper[4832]: I1003 00:21:51.055660 4832 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3b296d00954bf341fee4ad258c6ee44adbd2950c89e73aaf5141201b61c5332"} err="failed to get container status \"a3b296d00954bf341fee4ad258c6ee44adbd2950c89e73aaf5141201b61c5332\": rpc error: code = NotFound desc = could not find container \"a3b296d00954bf341fee4ad258c6ee44adbd2950c89e73aaf5141201b61c5332\": container with ID starting with a3b296d00954bf341fee4ad258c6ee44adbd2950c89e73aaf5141201b61c5332 not found: ID does not exist" Oct 03 00:21:51 crc kubenswrapper[4832]: I1003 00:21:51.055677 4832 scope.go:117] "RemoveContainer" containerID="7712a571bceb2c88adb90f9f18cdef6f7666e96341c1f9c87e6a55c82b59efd3" Oct 03 00:21:51 crc kubenswrapper[4832]: E1003 00:21:51.055961 4832 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7712a571bceb2c88adb90f9f18cdef6f7666e96341c1f9c87e6a55c82b59efd3\": container with ID starting with 7712a571bceb2c88adb90f9f18cdef6f7666e96341c1f9c87e6a55c82b59efd3 not found: ID does not exist" containerID="7712a571bceb2c88adb90f9f18cdef6f7666e96341c1f9c87e6a55c82b59efd3" Oct 03 00:21:51 crc kubenswrapper[4832]: I1003 00:21:51.055979 4832 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7712a571bceb2c88adb90f9f18cdef6f7666e96341c1f9c87e6a55c82b59efd3"} err="failed to get container status \"7712a571bceb2c88adb90f9f18cdef6f7666e96341c1f9c87e6a55c82b59efd3\": rpc error: code = NotFound desc = could not find container \"7712a571bceb2c88adb90f9f18cdef6f7666e96341c1f9c87e6a55c82b59efd3\": container with ID starting with 7712a571bceb2c88adb90f9f18cdef6f7666e96341c1f9c87e6a55c82b59efd3 not found: ID does not exist" Oct 03 00:21:51 crc kubenswrapper[4832]: I1003 00:21:51.055991 4832 scope.go:117] "RemoveContainer" containerID="f2c63d70335d75dec7e19ae160dfa2a00f834d9aa3725bb4b63b94aa488fca0f" Oct 03 00:21:51 crc kubenswrapper[4832]: E1003 00:21:51.056193 4832 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f2c63d70335d75dec7e19ae160dfa2a00f834d9aa3725bb4b63b94aa488fca0f\": container with ID starting with f2c63d70335d75dec7e19ae160dfa2a00f834d9aa3725bb4b63b94aa488fca0f not found: ID does not exist" containerID="f2c63d70335d75dec7e19ae160dfa2a00f834d9aa3725bb4b63b94aa488fca0f" Oct 03 00:21:51 crc kubenswrapper[4832]: I1003 00:21:51.056233 4832 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2c63d70335d75dec7e19ae160dfa2a00f834d9aa3725bb4b63b94aa488fca0f"} err="failed to get container status \"f2c63d70335d75dec7e19ae160dfa2a00f834d9aa3725bb4b63b94aa488fca0f\": rpc 
error: code = NotFound desc = could not find container \"f2c63d70335d75dec7e19ae160dfa2a00f834d9aa3725bb4b63b94aa488fca0f\": container with ID starting with f2c63d70335d75dec7e19ae160dfa2a00f834d9aa3725bb4b63b94aa488fca0f not found: ID does not exist" Oct 03 00:21:51 crc kubenswrapper[4832]: I1003 00:21:51.056245 4832 scope.go:117] "RemoveContainer" containerID="2ba40dbb3c6b603f4c75b3f017b0b8953a189aac6dba4cc0e1fa5174b6353f3f" Oct 03 00:21:51 crc kubenswrapper[4832]: E1003 00:21:51.056656 4832 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ba40dbb3c6b603f4c75b3f017b0b8953a189aac6dba4cc0e1fa5174b6353f3f\": container with ID starting with 2ba40dbb3c6b603f4c75b3f017b0b8953a189aac6dba4cc0e1fa5174b6353f3f not found: ID does not exist" containerID="2ba40dbb3c6b603f4c75b3f017b0b8953a189aac6dba4cc0e1fa5174b6353f3f" Oct 03 00:21:51 crc kubenswrapper[4832]: I1003 00:21:51.056679 4832 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ba40dbb3c6b603f4c75b3f017b0b8953a189aac6dba4cc0e1fa5174b6353f3f"} err="failed to get container status \"2ba40dbb3c6b603f4c75b3f017b0b8953a189aac6dba4cc0e1fa5174b6353f3f\": rpc error: code = NotFound desc = could not find container \"2ba40dbb3c6b603f4c75b3f017b0b8953a189aac6dba4cc0e1fa5174b6353f3f\": container with ID starting with 2ba40dbb3c6b603f4c75b3f017b0b8953a189aac6dba4cc0e1fa5174b6353f3f not found: ID does not exist" Oct 03 00:21:51 crc kubenswrapper[4832]: I1003 00:21:51.056695 4832 scope.go:117] "RemoveContainer" containerID="e6ee507d1127a1ca82e96f57a031cfef9adbd0b310588ce49a32f0cbcab7b3f4" Oct 03 00:21:51 crc kubenswrapper[4832]: E1003 00:21:51.057296 4832 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6ee507d1127a1ca82e96f57a031cfef9adbd0b310588ce49a32f0cbcab7b3f4\": container with ID starting with e6ee507d1127a1ca82e96f57a031cfef9adbd0b310588ce49a32f0cbcab7b3f4 not found: ID does not exist" containerID="e6ee507d1127a1ca82e96f57a031cfef9adbd0b310588ce49a32f0cbcab7b3f4" Oct 03 00:21:51 crc kubenswrapper[4832]: I1003 00:21:51.057350 4832 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6ee507d1127a1ca82e96f57a031cfef9adbd0b310588ce49a32f0cbcab7b3f4"} err="failed to get container status \"e6ee507d1127a1ca82e96f57a031cfef9adbd0b310588ce49a32f0cbcab7b3f4\": rpc error: code = NotFound desc = could not find container \"e6ee507d1127a1ca82e96f57a031cfef9adbd0b310588ce49a32f0cbcab7b3f4\": container with ID starting with e6ee507d1127a1ca82e96f57a031cfef9adbd0b310588ce49a32f0cbcab7b3f4 not found: ID does not exist" Oct 03 00:21:51 crc kubenswrapper[4832]: I1003 00:21:51.057384 4832 scope.go:117] "RemoveContainer" containerID="1e493b0aab45e7a7842b48f0a13a6850b3bcd8a8f2147880ff278b92b04e9535" Oct 03 00:21:51 crc kubenswrapper[4832]: E1003 00:21:51.057843 4832 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e493b0aab45e7a7842b48f0a13a6850b3bcd8a8f2147880ff278b92b04e9535\": container with ID starting with 1e493b0aab45e7a7842b48f0a13a6850b3bcd8a8f2147880ff278b92b04e9535 not found: ID does not exist" containerID="1e493b0aab45e7a7842b48f0a13a6850b3bcd8a8f2147880ff278b92b04e9535" Oct 03 00:21:51 crc kubenswrapper[4832]: I1003 00:21:51.057870 4832 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"1e493b0aab45e7a7842b48f0a13a6850b3bcd8a8f2147880ff278b92b04e9535"} err="failed to get container status \"1e493b0aab45e7a7842b48f0a13a6850b3bcd8a8f2147880ff278b92b04e9535\": rpc error: code = NotFound desc = could not find container \"1e493b0aab45e7a7842b48f0a13a6850b3bcd8a8f2147880ff278b92b04e9535\": container with ID starting with 1e493b0aab45e7a7842b48f0a13a6850b3bcd8a8f2147880ff278b92b04e9535 not found: ID does not exist" Oct 03 00:21:51 crc kubenswrapper[4832]: I1003 00:21:51.057890 4832 scope.go:117] "RemoveContainer" containerID="0669e6b46f57beece4c83940bcffe580a4b4e2db5451a9c9ca4a9ad486fc95b6" Oct 03 00:21:51 crc kubenswrapper[4832]: E1003 00:21:51.058188 4832 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0669e6b46f57beece4c83940bcffe580a4b4e2db5451a9c9ca4a9ad486fc95b6\": container with ID starting with 0669e6b46f57beece4c83940bcffe580a4b4e2db5451a9c9ca4a9ad486fc95b6 not found: ID does not exist" containerID="0669e6b46f57beece4c83940bcffe580a4b4e2db5451a9c9ca4a9ad486fc95b6" Oct 03 00:21:51 crc kubenswrapper[4832]: I1003 00:21:51.058222 4832 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0669e6b46f57beece4c83940bcffe580a4b4e2db5451a9c9ca4a9ad486fc95b6"} err="failed to get container status \"0669e6b46f57beece4c83940bcffe580a4b4e2db5451a9c9ca4a9ad486fc95b6\": rpc error: code = NotFound desc = could not find container \"0669e6b46f57beece4c83940bcffe580a4b4e2db5451a9c9ca4a9ad486fc95b6\": container with ID starting with 0669e6b46f57beece4c83940bcffe580a4b4e2db5451a9c9ca4a9ad486fc95b6 not found: ID does not exist" Oct 03 00:21:51 crc kubenswrapper[4832]: I1003 00:21:51.058243 4832 scope.go:117] "RemoveContainer" containerID="331f9777df369634e691a52d3564282da8b3d170cca2aad394f2d3e7888228ae" Oct 03 00:21:51 crc kubenswrapper[4832]: I1003 00:21:51.058486 4832 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"331f9777df369634e691a52d3564282da8b3d170cca2aad394f2d3e7888228ae"} err="failed to get container status \"331f9777df369634e691a52d3564282da8b3d170cca2aad394f2d3e7888228ae\": rpc error: code = NotFound desc = could not find container \"331f9777df369634e691a52d3564282da8b3d170cca2aad394f2d3e7888228ae\": container with ID starting with 331f9777df369634e691a52d3564282da8b3d170cca2aad394f2d3e7888228ae not found: ID does not exist" Oct 03 00:21:51 crc kubenswrapper[4832]: I1003 00:21:51.058512 4832 scope.go:117] "RemoveContainer" containerID="e1fbfcaa81a9cf32a47adcb39be9ce4e2d5f3b9cd8b8e8556cea49fd099ccef3" Oct 03 00:21:51 crc kubenswrapper[4832]: I1003 00:21:51.058858 4832 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1fbfcaa81a9cf32a47adcb39be9ce4e2d5f3b9cd8b8e8556cea49fd099ccef3"} err="failed to get container status \"e1fbfcaa81a9cf32a47adcb39be9ce4e2d5f3b9cd8b8e8556cea49fd099ccef3\": rpc error: code = NotFound desc = could not find container \"e1fbfcaa81a9cf32a47adcb39be9ce4e2d5f3b9cd8b8e8556cea49fd099ccef3\": container with ID starting with e1fbfcaa81a9cf32a47adcb39be9ce4e2d5f3b9cd8b8e8556cea49fd099ccef3 not found: ID does not exist" Oct 03 00:21:51 crc kubenswrapper[4832]: I1003 00:21:51.058886 4832 scope.go:117] "RemoveContainer" containerID="ae1e89f85d166170fd478f58e653b46eaa1c8335c1b124af9fde5b805864197b" Oct 03 00:21:51 crc kubenswrapper[4832]: I1003 00:21:51.059198 4832 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"ae1e89f85d166170fd478f58e653b46eaa1c8335c1b124af9fde5b805864197b"} err="failed to get container status \"ae1e89f85d166170fd478f58e653b46eaa1c8335c1b124af9fde5b805864197b\": rpc error: code = NotFound desc = could not find container \"ae1e89f85d166170fd478f58e653b46eaa1c8335c1b124af9fde5b805864197b\": container with ID starting with ae1e89f85d166170fd478f58e653b46eaa1c8335c1b124af9fde5b805864197b not found: ID does not exist" Oct 03 00:21:51 crc kubenswrapper[4832]: I1003 00:21:51.059226 4832 scope.go:117] "RemoveContainer" containerID="a3b296d00954bf341fee4ad258c6ee44adbd2950c89e73aaf5141201b61c5332" Oct 03 00:21:51 crc kubenswrapper[4832]: I1003 00:21:51.059531 4832 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3b296d00954bf341fee4ad258c6ee44adbd2950c89e73aaf5141201b61c5332"} err="failed to get container status \"a3b296d00954bf341fee4ad258c6ee44adbd2950c89e73aaf5141201b61c5332\": rpc error: code = NotFound desc = could not find container \"a3b296d00954bf341fee4ad258c6ee44adbd2950c89e73aaf5141201b61c5332\": container with ID starting with a3b296d00954bf341fee4ad258c6ee44adbd2950c89e73aaf5141201b61c5332 not found: ID does not exist" Oct 03 00:21:51 crc kubenswrapper[4832]: I1003 00:21:51.059555 4832 scope.go:117] "RemoveContainer" containerID="7712a571bceb2c88adb90f9f18cdef6f7666e96341c1f9c87e6a55c82b59efd3" Oct 03 00:21:51 crc kubenswrapper[4832]: I1003 00:21:51.059823 4832 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7712a571bceb2c88adb90f9f18cdef6f7666e96341c1f9c87e6a55c82b59efd3"} err="failed to get container status \"7712a571bceb2c88adb90f9f18cdef6f7666e96341c1f9c87e6a55c82b59efd3\": rpc error: code = NotFound desc = could not find container \"7712a571bceb2c88adb90f9f18cdef6f7666e96341c1f9c87e6a55c82b59efd3\": container with ID starting with 7712a571bceb2c88adb90f9f18cdef6f7666e96341c1f9c87e6a55c82b59efd3 not found: ID does not exist" Oct 03 00:21:51 crc kubenswrapper[4832]: I1003 00:21:51.059847 4832 scope.go:117] "RemoveContainer" containerID="f2c63d70335d75dec7e19ae160dfa2a00f834d9aa3725bb4b63b94aa488fca0f" Oct 03 00:21:51 crc kubenswrapper[4832]: I1003 00:21:51.060126 4832 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2c63d70335d75dec7e19ae160dfa2a00f834d9aa3725bb4b63b94aa488fca0f"} err="failed to get container status \"f2c63d70335d75dec7e19ae160dfa2a00f834d9aa3725bb4b63b94aa488fca0f\": rpc error: code = NotFound desc = could not find container \"f2c63d70335d75dec7e19ae160dfa2a00f834d9aa3725bb4b63b94aa488fca0f\": container with ID starting with f2c63d70335d75dec7e19ae160dfa2a00f834d9aa3725bb4b63b94aa488fca0f not found: ID does not exist" Oct 03 00:21:51 crc kubenswrapper[4832]: I1003 00:21:51.060149 4832 scope.go:117] "RemoveContainer" containerID="2ba40dbb3c6b603f4c75b3f017b0b8953a189aac6dba4cc0e1fa5174b6353f3f" Oct 03 00:21:51 crc kubenswrapper[4832]: I1003 00:21:51.060386 4832 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ba40dbb3c6b603f4c75b3f017b0b8953a189aac6dba4cc0e1fa5174b6353f3f"} err="failed to get container status \"2ba40dbb3c6b603f4c75b3f017b0b8953a189aac6dba4cc0e1fa5174b6353f3f\": rpc error: code = NotFound desc = could not find container \"2ba40dbb3c6b603f4c75b3f017b0b8953a189aac6dba4cc0e1fa5174b6353f3f\": container with ID starting with 2ba40dbb3c6b603f4c75b3f017b0b8953a189aac6dba4cc0e1fa5174b6353f3f not found: ID does not exist" Oct 
03 00:21:51 crc kubenswrapper[4832]: I1003 00:21:51.060410 4832 scope.go:117] "RemoveContainer" containerID="e6ee507d1127a1ca82e96f57a031cfef9adbd0b310588ce49a32f0cbcab7b3f4" Oct 03 00:21:51 crc kubenswrapper[4832]: I1003 00:21:51.060668 4832 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6ee507d1127a1ca82e96f57a031cfef9adbd0b310588ce49a32f0cbcab7b3f4"} err="failed to get container status \"e6ee507d1127a1ca82e96f57a031cfef9adbd0b310588ce49a32f0cbcab7b3f4\": rpc error: code = NotFound desc = could not find container \"e6ee507d1127a1ca82e96f57a031cfef9adbd0b310588ce49a32f0cbcab7b3f4\": container with ID starting with e6ee507d1127a1ca82e96f57a031cfef9adbd0b310588ce49a32f0cbcab7b3f4 not found: ID does not exist" Oct 03 00:21:51 crc kubenswrapper[4832]: I1003 00:21:51.060725 4832 scope.go:117] "RemoveContainer" containerID="1e493b0aab45e7a7842b48f0a13a6850b3bcd8a8f2147880ff278b92b04e9535" Oct 03 00:21:51 crc kubenswrapper[4832]: I1003 00:21:51.061026 4832 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e493b0aab45e7a7842b48f0a13a6850b3bcd8a8f2147880ff278b92b04e9535"} err="failed to get container status \"1e493b0aab45e7a7842b48f0a13a6850b3bcd8a8f2147880ff278b92b04e9535\": rpc error: code = NotFound desc = could not find container \"1e493b0aab45e7a7842b48f0a13a6850b3bcd8a8f2147880ff278b92b04e9535\": container with ID starting with 1e493b0aab45e7a7842b48f0a13a6850b3bcd8a8f2147880ff278b92b04e9535 not found: ID does not exist" Oct 03 00:21:51 crc kubenswrapper[4832]: I1003 00:21:51.061049 4832 scope.go:117] "RemoveContainer" containerID="0669e6b46f57beece4c83940bcffe580a4b4e2db5451a9c9ca4a9ad486fc95b6" Oct 03 00:21:51 crc kubenswrapper[4832]: I1003 00:21:51.061308 4832 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0669e6b46f57beece4c83940bcffe580a4b4e2db5451a9c9ca4a9ad486fc95b6"} err="failed to get container status \"0669e6b46f57beece4c83940bcffe580a4b4e2db5451a9c9ca4a9ad486fc95b6\": rpc error: code = NotFound desc = could not find container \"0669e6b46f57beece4c83940bcffe580a4b4e2db5451a9c9ca4a9ad486fc95b6\": container with ID starting with 0669e6b46f57beece4c83940bcffe580a4b4e2db5451a9c9ca4a9ad486fc95b6 not found: ID does not exist" Oct 03 00:21:51 crc kubenswrapper[4832]: I1003 00:21:51.061328 4832 scope.go:117] "RemoveContainer" containerID="331f9777df369634e691a52d3564282da8b3d170cca2aad394f2d3e7888228ae" Oct 03 00:21:51 crc kubenswrapper[4832]: I1003 00:21:51.061552 4832 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"331f9777df369634e691a52d3564282da8b3d170cca2aad394f2d3e7888228ae"} err="failed to get container status \"331f9777df369634e691a52d3564282da8b3d170cca2aad394f2d3e7888228ae\": rpc error: code = NotFound desc = could not find container \"331f9777df369634e691a52d3564282da8b3d170cca2aad394f2d3e7888228ae\": container with ID starting with 331f9777df369634e691a52d3564282da8b3d170cca2aad394f2d3e7888228ae not found: ID does not exist" Oct 03 00:21:51 crc kubenswrapper[4832]: I1003 00:21:51.061568 4832 scope.go:117] "RemoveContainer" containerID="e1fbfcaa81a9cf32a47adcb39be9ce4e2d5f3b9cd8b8e8556cea49fd099ccef3" Oct 03 00:21:51 crc kubenswrapper[4832]: I1003 00:21:51.061774 4832 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1fbfcaa81a9cf32a47adcb39be9ce4e2d5f3b9cd8b8e8556cea49fd099ccef3"} err="failed to get container status 
\"e1fbfcaa81a9cf32a47adcb39be9ce4e2d5f3b9cd8b8e8556cea49fd099ccef3\": rpc error: code = NotFound desc = could not find container \"e1fbfcaa81a9cf32a47adcb39be9ce4e2d5f3b9cd8b8e8556cea49fd099ccef3\": container with ID starting with e1fbfcaa81a9cf32a47adcb39be9ce4e2d5f3b9cd8b8e8556cea49fd099ccef3 not found: ID does not exist" Oct 03 00:21:51 crc kubenswrapper[4832]: I1003 00:21:51.061794 4832 scope.go:117] "RemoveContainer" containerID="ae1e89f85d166170fd478f58e653b46eaa1c8335c1b124af9fde5b805864197b" Oct 03 00:21:51 crc kubenswrapper[4832]: I1003 00:21:51.062041 4832 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae1e89f85d166170fd478f58e653b46eaa1c8335c1b124af9fde5b805864197b"} err="failed to get container status \"ae1e89f85d166170fd478f58e653b46eaa1c8335c1b124af9fde5b805864197b\": rpc error: code = NotFound desc = could not find container \"ae1e89f85d166170fd478f58e653b46eaa1c8335c1b124af9fde5b805864197b\": container with ID starting with ae1e89f85d166170fd478f58e653b46eaa1c8335c1b124af9fde5b805864197b not found: ID does not exist" Oct 03 00:21:51 crc kubenswrapper[4832]: I1003 00:21:51.062062 4832 scope.go:117] "RemoveContainer" containerID="a3b296d00954bf341fee4ad258c6ee44adbd2950c89e73aaf5141201b61c5332" Oct 03 00:21:51 crc kubenswrapper[4832]: I1003 00:21:51.062300 4832 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3b296d00954bf341fee4ad258c6ee44adbd2950c89e73aaf5141201b61c5332"} err="failed to get container status \"a3b296d00954bf341fee4ad258c6ee44adbd2950c89e73aaf5141201b61c5332\": rpc error: code = NotFound desc = could not find container \"a3b296d00954bf341fee4ad258c6ee44adbd2950c89e73aaf5141201b61c5332\": container with ID starting with a3b296d00954bf341fee4ad258c6ee44adbd2950c89e73aaf5141201b61c5332 not found: ID does not exist" Oct 03 00:21:51 crc kubenswrapper[4832]: I1003 00:21:51.062318 4832 scope.go:117] "RemoveContainer" containerID="7712a571bceb2c88adb90f9f18cdef6f7666e96341c1f9c87e6a55c82b59efd3" Oct 03 00:21:51 crc kubenswrapper[4832]: I1003 00:21:51.062535 4832 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7712a571bceb2c88adb90f9f18cdef6f7666e96341c1f9c87e6a55c82b59efd3"} err="failed to get container status \"7712a571bceb2c88adb90f9f18cdef6f7666e96341c1f9c87e6a55c82b59efd3\": rpc error: code = NotFound desc = could not find container \"7712a571bceb2c88adb90f9f18cdef6f7666e96341c1f9c87e6a55c82b59efd3\": container with ID starting with 7712a571bceb2c88adb90f9f18cdef6f7666e96341c1f9c87e6a55c82b59efd3 not found: ID does not exist" Oct 03 00:21:51 crc kubenswrapper[4832]: I1003 00:21:51.062556 4832 scope.go:117] "RemoveContainer" containerID="f2c63d70335d75dec7e19ae160dfa2a00f834d9aa3725bb4b63b94aa488fca0f" Oct 03 00:21:51 crc kubenswrapper[4832]: I1003 00:21:51.062820 4832 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2c63d70335d75dec7e19ae160dfa2a00f834d9aa3725bb4b63b94aa488fca0f"} err="failed to get container status \"f2c63d70335d75dec7e19ae160dfa2a00f834d9aa3725bb4b63b94aa488fca0f\": rpc error: code = NotFound desc = could not find container \"f2c63d70335d75dec7e19ae160dfa2a00f834d9aa3725bb4b63b94aa488fca0f\": container with ID starting with f2c63d70335d75dec7e19ae160dfa2a00f834d9aa3725bb4b63b94aa488fca0f not found: ID does not exist" Oct 03 00:21:51 crc kubenswrapper[4832]: I1003 00:21:51.062843 4832 scope.go:117] "RemoveContainer" 
containerID="2ba40dbb3c6b603f4c75b3f017b0b8953a189aac6dba4cc0e1fa5174b6353f3f" Oct 03 00:21:51 crc kubenswrapper[4832]: I1003 00:21:51.063033 4832 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ba40dbb3c6b603f4c75b3f017b0b8953a189aac6dba4cc0e1fa5174b6353f3f"} err="failed to get container status \"2ba40dbb3c6b603f4c75b3f017b0b8953a189aac6dba4cc0e1fa5174b6353f3f\": rpc error: code = NotFound desc = could not find container \"2ba40dbb3c6b603f4c75b3f017b0b8953a189aac6dba4cc0e1fa5174b6353f3f\": container with ID starting with 2ba40dbb3c6b603f4c75b3f017b0b8953a189aac6dba4cc0e1fa5174b6353f3f not found: ID does not exist" Oct 03 00:21:51 crc kubenswrapper[4832]: I1003 00:21:51.063052 4832 scope.go:117] "RemoveContainer" containerID="e6ee507d1127a1ca82e96f57a031cfef9adbd0b310588ce49a32f0cbcab7b3f4" Oct 03 00:21:51 crc kubenswrapper[4832]: I1003 00:21:51.063293 4832 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6ee507d1127a1ca82e96f57a031cfef9adbd0b310588ce49a32f0cbcab7b3f4"} err="failed to get container status \"e6ee507d1127a1ca82e96f57a031cfef9adbd0b310588ce49a32f0cbcab7b3f4\": rpc error: code = NotFound desc = could not find container \"e6ee507d1127a1ca82e96f57a031cfef9adbd0b310588ce49a32f0cbcab7b3f4\": container with ID starting with e6ee507d1127a1ca82e96f57a031cfef9adbd0b310588ce49a32f0cbcab7b3f4 not found: ID does not exist" Oct 03 00:21:51 crc kubenswrapper[4832]: I1003 00:21:51.063316 4832 scope.go:117] "RemoveContainer" containerID="1e493b0aab45e7a7842b48f0a13a6850b3bcd8a8f2147880ff278b92b04e9535" Oct 03 00:21:51 crc kubenswrapper[4832]: I1003 00:21:51.063515 4832 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e493b0aab45e7a7842b48f0a13a6850b3bcd8a8f2147880ff278b92b04e9535"} err="failed to get container status \"1e493b0aab45e7a7842b48f0a13a6850b3bcd8a8f2147880ff278b92b04e9535\": rpc error: code = NotFound desc = could not find container \"1e493b0aab45e7a7842b48f0a13a6850b3bcd8a8f2147880ff278b92b04e9535\": container with ID starting with 1e493b0aab45e7a7842b48f0a13a6850b3bcd8a8f2147880ff278b92b04e9535 not found: ID does not exist" Oct 03 00:21:51 crc kubenswrapper[4832]: I1003 00:21:51.063532 4832 scope.go:117] "RemoveContainer" containerID="0669e6b46f57beece4c83940bcffe580a4b4e2db5451a9c9ca4a9ad486fc95b6" Oct 03 00:21:51 crc kubenswrapper[4832]: I1003 00:21:51.063730 4832 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0669e6b46f57beece4c83940bcffe580a4b4e2db5451a9c9ca4a9ad486fc95b6"} err="failed to get container status \"0669e6b46f57beece4c83940bcffe580a4b4e2db5451a9c9ca4a9ad486fc95b6\": rpc error: code = NotFound desc = could not find container \"0669e6b46f57beece4c83940bcffe580a4b4e2db5451a9c9ca4a9ad486fc95b6\": container with ID starting with 0669e6b46f57beece4c83940bcffe580a4b4e2db5451a9c9ca4a9ad486fc95b6 not found: ID does not exist" Oct 03 00:21:51 crc kubenswrapper[4832]: I1003 00:21:51.063744 4832 scope.go:117] "RemoveContainer" containerID="331f9777df369634e691a52d3564282da8b3d170cca2aad394f2d3e7888228ae" Oct 03 00:21:51 crc kubenswrapper[4832]: I1003 00:21:51.063994 4832 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"331f9777df369634e691a52d3564282da8b3d170cca2aad394f2d3e7888228ae"} err="failed to get container status \"331f9777df369634e691a52d3564282da8b3d170cca2aad394f2d3e7888228ae\": rpc error: code = NotFound desc = could not find 
container \"331f9777df369634e691a52d3564282da8b3d170cca2aad394f2d3e7888228ae\": container with ID starting with 331f9777df369634e691a52d3564282da8b3d170cca2aad394f2d3e7888228ae not found: ID does not exist" Oct 03 00:21:51 crc kubenswrapper[4832]: I1003 00:21:51.064008 4832 scope.go:117] "RemoveContainer" containerID="e1fbfcaa81a9cf32a47adcb39be9ce4e2d5f3b9cd8b8e8556cea49fd099ccef3" Oct 03 00:21:51 crc kubenswrapper[4832]: I1003 00:21:51.064217 4832 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1fbfcaa81a9cf32a47adcb39be9ce4e2d5f3b9cd8b8e8556cea49fd099ccef3"} err="failed to get container status \"e1fbfcaa81a9cf32a47adcb39be9ce4e2d5f3b9cd8b8e8556cea49fd099ccef3\": rpc error: code = NotFound desc = could not find container \"e1fbfcaa81a9cf32a47adcb39be9ce4e2d5f3b9cd8b8e8556cea49fd099ccef3\": container with ID starting with e1fbfcaa81a9cf32a47adcb39be9ce4e2d5f3b9cd8b8e8556cea49fd099ccef3 not found: ID does not exist" Oct 03 00:21:51 crc kubenswrapper[4832]: I1003 00:21:51.064237 4832 scope.go:117] "RemoveContainer" containerID="ae1e89f85d166170fd478f58e653b46eaa1c8335c1b124af9fde5b805864197b" Oct 03 00:21:51 crc kubenswrapper[4832]: I1003 00:21:51.064435 4832 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae1e89f85d166170fd478f58e653b46eaa1c8335c1b124af9fde5b805864197b"} err="failed to get container status \"ae1e89f85d166170fd478f58e653b46eaa1c8335c1b124af9fde5b805864197b\": rpc error: code = NotFound desc = could not find container \"ae1e89f85d166170fd478f58e653b46eaa1c8335c1b124af9fde5b805864197b\": container with ID starting with ae1e89f85d166170fd478f58e653b46eaa1c8335c1b124af9fde5b805864197b not found: ID does not exist" Oct 03 00:21:51 crc kubenswrapper[4832]: I1003 00:21:51.064448 4832 scope.go:117] "RemoveContainer" containerID="a3b296d00954bf341fee4ad258c6ee44adbd2950c89e73aaf5141201b61c5332" Oct 03 00:21:51 crc kubenswrapper[4832]: I1003 00:21:51.065511 4832 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3b296d00954bf341fee4ad258c6ee44adbd2950c89e73aaf5141201b61c5332"} err="failed to get container status \"a3b296d00954bf341fee4ad258c6ee44adbd2950c89e73aaf5141201b61c5332\": rpc error: code = NotFound desc = could not find container \"a3b296d00954bf341fee4ad258c6ee44adbd2950c89e73aaf5141201b61c5332\": container with ID starting with a3b296d00954bf341fee4ad258c6ee44adbd2950c89e73aaf5141201b61c5332 not found: ID does not exist" Oct 03 00:21:51 crc kubenswrapper[4832]: I1003 00:21:51.065554 4832 scope.go:117] "RemoveContainer" containerID="7712a571bceb2c88adb90f9f18cdef6f7666e96341c1f9c87e6a55c82b59efd3" Oct 03 00:21:51 crc kubenswrapper[4832]: I1003 00:21:51.065866 4832 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7712a571bceb2c88adb90f9f18cdef6f7666e96341c1f9c87e6a55c82b59efd3"} err="failed to get container status \"7712a571bceb2c88adb90f9f18cdef6f7666e96341c1f9c87e6a55c82b59efd3\": rpc error: code = NotFound desc = could not find container \"7712a571bceb2c88adb90f9f18cdef6f7666e96341c1f9c87e6a55c82b59efd3\": container with ID starting with 7712a571bceb2c88adb90f9f18cdef6f7666e96341c1f9c87e6a55c82b59efd3 not found: ID does not exist" Oct 03 00:21:51 crc kubenswrapper[4832]: I1003 00:21:51.065887 4832 scope.go:117] "RemoveContainer" containerID="f2c63d70335d75dec7e19ae160dfa2a00f834d9aa3725bb4b63b94aa488fca0f" Oct 03 00:21:51 crc kubenswrapper[4832]: I1003 00:21:51.066304 4832 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2c63d70335d75dec7e19ae160dfa2a00f834d9aa3725bb4b63b94aa488fca0f"} err="failed to get container status \"f2c63d70335d75dec7e19ae160dfa2a00f834d9aa3725bb4b63b94aa488fca0f\": rpc error: code = NotFound desc = could not find container \"f2c63d70335d75dec7e19ae160dfa2a00f834d9aa3725bb4b63b94aa488fca0f\": container with ID starting with f2c63d70335d75dec7e19ae160dfa2a00f834d9aa3725bb4b63b94aa488fca0f not found: ID does not exist" Oct 03 00:21:51 crc kubenswrapper[4832]: I1003 00:21:51.066325 4832 scope.go:117] "RemoveContainer" containerID="2ba40dbb3c6b603f4c75b3f017b0b8953a189aac6dba4cc0e1fa5174b6353f3f" Oct 03 00:21:51 crc kubenswrapper[4832]: I1003 00:21:51.066710 4832 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ba40dbb3c6b603f4c75b3f017b0b8953a189aac6dba4cc0e1fa5174b6353f3f"} err="failed to get container status \"2ba40dbb3c6b603f4c75b3f017b0b8953a189aac6dba4cc0e1fa5174b6353f3f\": rpc error: code = NotFound desc = could not find container \"2ba40dbb3c6b603f4c75b3f017b0b8953a189aac6dba4cc0e1fa5174b6353f3f\": container with ID starting with 2ba40dbb3c6b603f4c75b3f017b0b8953a189aac6dba4cc0e1fa5174b6353f3f not found: ID does not exist" Oct 03 00:21:51 crc kubenswrapper[4832]: I1003 00:21:51.066728 4832 scope.go:117] "RemoveContainer" containerID="e6ee507d1127a1ca82e96f57a031cfef9adbd0b310588ce49a32f0cbcab7b3f4" Oct 03 00:21:51 crc kubenswrapper[4832]: I1003 00:21:51.066994 4832 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6ee507d1127a1ca82e96f57a031cfef9adbd0b310588ce49a32f0cbcab7b3f4"} err="failed to get container status \"e6ee507d1127a1ca82e96f57a031cfef9adbd0b310588ce49a32f0cbcab7b3f4\": rpc error: code = NotFound desc = could not find container \"e6ee507d1127a1ca82e96f57a031cfef9adbd0b310588ce49a32f0cbcab7b3f4\": container with ID starting with e6ee507d1127a1ca82e96f57a031cfef9adbd0b310588ce49a32f0cbcab7b3f4 not found: ID does not exist" Oct 03 00:21:51 crc kubenswrapper[4832]: I1003 00:21:51.067014 4832 scope.go:117] "RemoveContainer" containerID="1e493b0aab45e7a7842b48f0a13a6850b3bcd8a8f2147880ff278b92b04e9535" Oct 03 00:21:51 crc kubenswrapper[4832]: I1003 00:21:51.067284 4832 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e493b0aab45e7a7842b48f0a13a6850b3bcd8a8f2147880ff278b92b04e9535"} err="failed to get container status \"1e493b0aab45e7a7842b48f0a13a6850b3bcd8a8f2147880ff278b92b04e9535\": rpc error: code = NotFound desc = could not find container \"1e493b0aab45e7a7842b48f0a13a6850b3bcd8a8f2147880ff278b92b04e9535\": container with ID starting with 1e493b0aab45e7a7842b48f0a13a6850b3bcd8a8f2147880ff278b92b04e9535 not found: ID does not exist" Oct 03 00:21:51 crc kubenswrapper[4832]: I1003 00:21:51.067307 4832 scope.go:117] "RemoveContainer" containerID="0669e6b46f57beece4c83940bcffe580a4b4e2db5451a9c9ca4a9ad486fc95b6" Oct 03 00:21:51 crc kubenswrapper[4832]: I1003 00:21:51.067490 4832 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0669e6b46f57beece4c83940bcffe580a4b4e2db5451a9c9ca4a9ad486fc95b6"} err="failed to get container status \"0669e6b46f57beece4c83940bcffe580a4b4e2db5451a9c9ca4a9ad486fc95b6\": rpc error: code = NotFound desc = could not find container \"0669e6b46f57beece4c83940bcffe580a4b4e2db5451a9c9ca4a9ad486fc95b6\": container with ID starting with 
0669e6b46f57beece4c83940bcffe580a4b4e2db5451a9c9ca4a9ad486fc95b6 not found: ID does not exist" Oct 03 00:21:51 crc kubenswrapper[4832]: I1003 00:21:51.394330 4832 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f189ec1-8169-4886-bff6-f67f7958050e" path="/var/lib/kubelet/pods/0f189ec1-8169-4886-bff6-f67f7958050e/volumes" Oct 03 00:21:51 crc kubenswrapper[4832]: I1003 00:21:51.803614 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ctmv8" event={"ID":"283fb468-33d8-48c8-8b41-c9c9df346eae","Type":"ContainerStarted","Data":"a9b04abdbc2fdc1958cd2e86dca83f09e2762a350c23b6cdae63e0e9d49f7dea"} Oct 03 00:21:51 crc kubenswrapper[4832]: I1003 00:21:51.805074 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ctmv8" event={"ID":"283fb468-33d8-48c8-8b41-c9c9df346eae","Type":"ContainerStarted","Data":"afc2ece009df7a2863964d90c36caac89bbd9e80e94353f517fdca25bcbde556"} Oct 03 00:21:51 crc kubenswrapper[4832]: I1003 00:21:51.805228 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ctmv8" event={"ID":"283fb468-33d8-48c8-8b41-c9c9df346eae","Type":"ContainerStarted","Data":"35b2055b1ed90a3f63da2db0dde56ee8aad140ecb0cb685426ae39717dea1571"} Oct 03 00:21:51 crc kubenswrapper[4832]: I1003 00:21:51.805352 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ctmv8" event={"ID":"283fb468-33d8-48c8-8b41-c9c9df346eae","Type":"ContainerStarted","Data":"7fb599a47bc033cc6fada9b291c0823c952d9e14b3efd42b19eacf65b86522cb"} Oct 03 00:21:51 crc kubenswrapper[4832]: I1003 00:21:51.805468 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ctmv8" event={"ID":"283fb468-33d8-48c8-8b41-c9c9df346eae","Type":"ContainerStarted","Data":"32b1e2d72b18a40690c70ef6d34bef2ecffa8e921d56d336fe9bd7da849a8cbb"} Oct 03 00:21:51 crc kubenswrapper[4832]: I1003 00:21:51.805590 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ctmv8" event={"ID":"283fb468-33d8-48c8-8b41-c9c9df346eae","Type":"ContainerStarted","Data":"e1bb3aabcb197117dfa0615757239073d7f27926ba913dd4bc79ff7cfa3c7774"} Oct 03 00:21:51 crc kubenswrapper[4832]: I1003 00:21:51.809573 4832 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-rwqt7_96bf1ee1-9249-4db9-83da-17eb15aa5f9a/kube-multus/2.log" Oct 03 00:21:51 crc kubenswrapper[4832]: I1003 00:21:51.809635 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-rwqt7" event={"ID":"96bf1ee1-9249-4db9-83da-17eb15aa5f9a","Type":"ContainerStarted","Data":"53ea86180129fe6fa128a0c2edd3c607ccb054d2111689c10311da4193d07dda"} Oct 03 00:21:53 crc kubenswrapper[4832]: I1003 00:21:53.823293 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ctmv8" event={"ID":"283fb468-33d8-48c8-8b41-c9c9df346eae","Type":"ContainerStarted","Data":"6afd0c77d7fcba36b981f3e84c6ae48398846b167e78ca09b458a2436b6f65a0"} Oct 03 00:21:55 crc kubenswrapper[4832]: I1003 00:21:55.836828 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ctmv8" event={"ID":"283fb468-33d8-48c8-8b41-c9c9df346eae","Type":"ContainerStarted","Data":"058007574cf7584a61a0d08cbd8dd0748ca5653fa4bd1ac9fcead938fb6d36e0"} Oct 03 00:21:55 crc kubenswrapper[4832]: I1003 00:21:55.837160 4832 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-ovn-kubernetes/ovnkube-node-ctmv8" Oct 03 00:21:55 crc kubenswrapper[4832]: I1003 00:21:55.837172 4832 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-ctmv8" Oct 03 00:21:55 crc kubenswrapper[4832]: I1003 00:21:55.837183 4832 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-ctmv8" Oct 03 00:21:55 crc kubenswrapper[4832]: I1003 00:21:55.865671 4832 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-ctmv8" Oct 03 00:21:55 crc kubenswrapper[4832]: I1003 00:21:55.868446 4832 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-ctmv8" podStartSLOduration=5.868432478 podStartE2EDuration="5.868432478s" podCreationTimestamp="2025-10-03 00:21:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 00:21:55.868052326 +0000 UTC m=+909.130207452" watchObservedRunningTime="2025-10-03 00:21:55.868432478 +0000 UTC m=+909.130587594" Oct 03 00:21:55 crc kubenswrapper[4832]: I1003 00:21:55.880091 4832 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-ctmv8" Oct 03 00:21:56 crc kubenswrapper[4832]: I1003 00:21:56.396967 4832 patch_prober.go:28] interesting pod/machine-config-daemon-xzpk4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 00:21:56 crc kubenswrapper[4832]: I1003 00:21:56.397261 4832 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" podUID="13eeb165-19f7-4e3f-a442-1c9c570d853b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 00:22:20 crc kubenswrapper[4832]: I1003 00:22:20.495078 4832 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-ctmv8" Oct 03 00:22:26 crc kubenswrapper[4832]: I1003 00:22:26.432089 4832 patch_prober.go:28] interesting pod/machine-config-daemon-xzpk4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 00:22:26 crc kubenswrapper[4832]: I1003 00:22:26.432430 4832 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" podUID="13eeb165-19f7-4e3f-a442-1c9c570d853b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 00:22:26 crc kubenswrapper[4832]: I1003 00:22:26.432468 4832 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" Oct 03 00:22:26 crc kubenswrapper[4832]: I1003 00:22:26.432841 4832 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"fe714a41dbfd3df82e68d6952db5280812ff6c66e0e98d29ae26f7117a7c4a98"} 
pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 00:22:26 crc kubenswrapper[4832]: I1003 00:22:26.432901 4832 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" podUID="13eeb165-19f7-4e3f-a442-1c9c570d853b" containerName="machine-config-daemon" containerID="cri-o://fe714a41dbfd3df82e68d6952db5280812ff6c66e0e98d29ae26f7117a7c4a98" gracePeriod=600 Oct 03 00:22:26 crc kubenswrapper[4832]: I1003 00:22:26.990521 4832 generic.go:334] "Generic (PLEG): container finished" podID="13eeb165-19f7-4e3f-a442-1c9c570d853b" containerID="fe714a41dbfd3df82e68d6952db5280812ff6c66e0e98d29ae26f7117a7c4a98" exitCode=0 Oct 03 00:22:26 crc kubenswrapper[4832]: I1003 00:22:26.990594 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" event={"ID":"13eeb165-19f7-4e3f-a442-1c9c570d853b","Type":"ContainerDied","Data":"fe714a41dbfd3df82e68d6952db5280812ff6c66e0e98d29ae26f7117a7c4a98"} Oct 03 00:22:26 crc kubenswrapper[4832]: I1003 00:22:26.990863 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" event={"ID":"13eeb165-19f7-4e3f-a442-1c9c570d853b","Type":"ContainerStarted","Data":"8e0c5b5dfd83bacd8cdf28933197b737cfcc5a83c84debe8eeb40692c88e27e0"} Oct 03 00:22:26 crc kubenswrapper[4832]: I1003 00:22:26.990889 4832 scope.go:117] "RemoveContainer" containerID="fcefa3daf9b82c6a6a361876ef1ec6ea0451ec83cac8c17a21a11140c7e5a7a5" Oct 03 00:23:09 crc kubenswrapper[4832]: I1003 00:23:09.608271 4832 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-f9csr"] Oct 03 00:23:09 crc kubenswrapper[4832]: I1003 00:23:09.608996 4832 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-f9csr" podUID="c40eb97c-936a-4d46-9016-3a062386ae3c" containerName="registry-server" containerID="cri-o://177c9d05c3dec624ffb0ffc23ee090c6ada16e25553b570d5d51818e8edc9d0b" gracePeriod=30 Oct 03 00:23:09 crc kubenswrapper[4832]: I1003 00:23:09.944120 4832 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f9csr" Oct 03 00:23:09 crc kubenswrapper[4832]: I1003 00:23:09.994989 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c40eb97c-936a-4d46-9016-3a062386ae3c-utilities\") pod \"c40eb97c-936a-4d46-9016-3a062386ae3c\" (UID: \"c40eb97c-936a-4d46-9016-3a062386ae3c\") " Oct 03 00:23:09 crc kubenswrapper[4832]: I1003 00:23:09.995082 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xn2mk\" (UniqueName: \"kubernetes.io/projected/c40eb97c-936a-4d46-9016-3a062386ae3c-kube-api-access-xn2mk\") pod \"c40eb97c-936a-4d46-9016-3a062386ae3c\" (UID: \"c40eb97c-936a-4d46-9016-3a062386ae3c\") " Oct 03 00:23:09 crc kubenswrapper[4832]: I1003 00:23:09.995221 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c40eb97c-936a-4d46-9016-3a062386ae3c-catalog-content\") pod \"c40eb97c-936a-4d46-9016-3a062386ae3c\" (UID: \"c40eb97c-936a-4d46-9016-3a062386ae3c\") " Oct 03 00:23:09 crc kubenswrapper[4832]: I1003 00:23:09.996140 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c40eb97c-936a-4d46-9016-3a062386ae3c-utilities" (OuterVolumeSpecName: "utilities") pod "c40eb97c-936a-4d46-9016-3a062386ae3c" (UID: "c40eb97c-936a-4d46-9016-3a062386ae3c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 00:23:10 crc kubenswrapper[4832]: I1003 00:23:10.001516 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c40eb97c-936a-4d46-9016-3a062386ae3c-kube-api-access-xn2mk" (OuterVolumeSpecName: "kube-api-access-xn2mk") pod "c40eb97c-936a-4d46-9016-3a062386ae3c" (UID: "c40eb97c-936a-4d46-9016-3a062386ae3c"). InnerVolumeSpecName "kube-api-access-xn2mk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 00:23:10 crc kubenswrapper[4832]: I1003 00:23:10.007131 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c40eb97c-936a-4d46-9016-3a062386ae3c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c40eb97c-936a-4d46-9016-3a062386ae3c" (UID: "c40eb97c-936a-4d46-9016-3a062386ae3c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 00:23:10 crc kubenswrapper[4832]: I1003 00:23:10.096292 4832 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xn2mk\" (UniqueName: \"kubernetes.io/projected/c40eb97c-936a-4d46-9016-3a062386ae3c-kube-api-access-xn2mk\") on node \"crc\" DevicePath \"\"" Oct 03 00:23:10 crc kubenswrapper[4832]: I1003 00:23:10.096564 4832 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c40eb97c-936a-4d46-9016-3a062386ae3c-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 00:23:10 crc kubenswrapper[4832]: I1003 00:23:10.096625 4832 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c40eb97c-936a-4d46-9016-3a062386ae3c-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 00:23:10 crc kubenswrapper[4832]: I1003 00:23:10.188956 4832 generic.go:334] "Generic (PLEG): container finished" podID="c40eb97c-936a-4d46-9016-3a062386ae3c" containerID="177c9d05c3dec624ffb0ffc23ee090c6ada16e25553b570d5d51818e8edc9d0b" exitCode=0 Oct 03 00:23:10 crc kubenswrapper[4832]: I1003 00:23:10.189021 4832 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f9csr" Oct 03 00:23:10 crc kubenswrapper[4832]: I1003 00:23:10.189036 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f9csr" event={"ID":"c40eb97c-936a-4d46-9016-3a062386ae3c","Type":"ContainerDied","Data":"177c9d05c3dec624ffb0ffc23ee090c6ada16e25553b570d5d51818e8edc9d0b"} Oct 03 00:23:10 crc kubenswrapper[4832]: I1003 00:23:10.189327 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f9csr" event={"ID":"c40eb97c-936a-4d46-9016-3a062386ae3c","Type":"ContainerDied","Data":"4c8a3d61f2ab3118f38a914b09efdb23cc3d01ef05ed0f64ed799d6ac19f41cc"} Oct 03 00:23:10 crc kubenswrapper[4832]: I1003 00:23:10.189345 4832 scope.go:117] "RemoveContainer" containerID="177c9d05c3dec624ffb0ffc23ee090c6ada16e25553b570d5d51818e8edc9d0b" Oct 03 00:23:10 crc kubenswrapper[4832]: I1003 00:23:10.211316 4832 scope.go:117] "RemoveContainer" containerID="09b7ee49ed3a96cdf565a16828e513898cbc11bbd226ed20c84c2ea84c5f135c" Oct 03 00:23:10 crc kubenswrapper[4832]: I1003 00:23:10.215709 4832 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-f9csr"] Oct 03 00:23:10 crc kubenswrapper[4832]: I1003 00:23:10.218769 4832 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-f9csr"] Oct 03 00:23:10 crc kubenswrapper[4832]: I1003 00:23:10.235017 4832 scope.go:117] "RemoveContainer" containerID="36eab17ae289307ea9104a10805e37cf6b5e6ab3696f482fb176dd7d09d406a1" Oct 03 00:23:10 crc kubenswrapper[4832]: I1003 00:23:10.245114 4832 scope.go:117] "RemoveContainer" containerID="177c9d05c3dec624ffb0ffc23ee090c6ada16e25553b570d5d51818e8edc9d0b" Oct 03 00:23:10 crc kubenswrapper[4832]: E1003 00:23:10.245453 4832 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"177c9d05c3dec624ffb0ffc23ee090c6ada16e25553b570d5d51818e8edc9d0b\": container with ID starting with 177c9d05c3dec624ffb0ffc23ee090c6ada16e25553b570d5d51818e8edc9d0b not found: ID does not exist" containerID="177c9d05c3dec624ffb0ffc23ee090c6ada16e25553b570d5d51818e8edc9d0b" Oct 03 00:23:10 crc kubenswrapper[4832]: I1003 00:23:10.245503 4832 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"177c9d05c3dec624ffb0ffc23ee090c6ada16e25553b570d5d51818e8edc9d0b"} err="failed to get container status \"177c9d05c3dec624ffb0ffc23ee090c6ada16e25553b570d5d51818e8edc9d0b\": rpc error: code = NotFound desc = could not find container \"177c9d05c3dec624ffb0ffc23ee090c6ada16e25553b570d5d51818e8edc9d0b\": container with ID starting with 177c9d05c3dec624ffb0ffc23ee090c6ada16e25553b570d5d51818e8edc9d0b not found: ID does not exist" Oct 03 00:23:10 crc kubenswrapper[4832]: I1003 00:23:10.245529 4832 scope.go:117] "RemoveContainer" containerID="09b7ee49ed3a96cdf565a16828e513898cbc11bbd226ed20c84c2ea84c5f135c" Oct 03 00:23:10 crc kubenswrapper[4832]: E1003 00:23:10.245814 4832 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09b7ee49ed3a96cdf565a16828e513898cbc11bbd226ed20c84c2ea84c5f135c\": container with ID starting with 09b7ee49ed3a96cdf565a16828e513898cbc11bbd226ed20c84c2ea84c5f135c not found: ID does not exist" containerID="09b7ee49ed3a96cdf565a16828e513898cbc11bbd226ed20c84c2ea84c5f135c" Oct 03 00:23:10 crc kubenswrapper[4832]: I1003 00:23:10.245863 4832 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09b7ee49ed3a96cdf565a16828e513898cbc11bbd226ed20c84c2ea84c5f135c"} err="failed to get container status \"09b7ee49ed3a96cdf565a16828e513898cbc11bbd226ed20c84c2ea84c5f135c\": rpc error: code = NotFound desc = could not find container \"09b7ee49ed3a96cdf565a16828e513898cbc11bbd226ed20c84c2ea84c5f135c\": container with ID starting with 09b7ee49ed3a96cdf565a16828e513898cbc11bbd226ed20c84c2ea84c5f135c not found: ID does not exist" Oct 03 00:23:10 crc kubenswrapper[4832]: I1003 00:23:10.245882 4832 scope.go:117] "RemoveContainer" containerID="36eab17ae289307ea9104a10805e37cf6b5e6ab3696f482fb176dd7d09d406a1" Oct 03 00:23:10 crc kubenswrapper[4832]: E1003 00:23:10.246223 4832 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"36eab17ae289307ea9104a10805e37cf6b5e6ab3696f482fb176dd7d09d406a1\": container with ID starting with 36eab17ae289307ea9104a10805e37cf6b5e6ab3696f482fb176dd7d09d406a1 not found: ID does not exist" containerID="36eab17ae289307ea9104a10805e37cf6b5e6ab3696f482fb176dd7d09d406a1" Oct 03 00:23:10 crc kubenswrapper[4832]: I1003 00:23:10.246272 4832 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36eab17ae289307ea9104a10805e37cf6b5e6ab3696f482fb176dd7d09d406a1"} err="failed to get container status \"36eab17ae289307ea9104a10805e37cf6b5e6ab3696f482fb176dd7d09d406a1\": rpc error: code = NotFound desc = could not find container \"36eab17ae289307ea9104a10805e37cf6b5e6ab3696f482fb176dd7d09d406a1\": container with ID starting with 36eab17ae289307ea9104a10805e37cf6b5e6ab3696f482fb176dd7d09d406a1 not found: ID does not exist" Oct 03 00:23:11 crc kubenswrapper[4832]: I1003 00:23:11.393473 4832 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c40eb97c-936a-4d46-9016-3a062386ae3c" path="/var/lib/kubelet/pods/c40eb97c-936a-4d46-9016-3a062386ae3c/volumes" Oct 03 00:23:13 crc kubenswrapper[4832]: I1003 00:23:13.398085 4832 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2djgm5j"] Oct 03 00:23:13 crc kubenswrapper[4832]: E1003 00:23:13.398535 4832 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="c40eb97c-936a-4d46-9016-3a062386ae3c" containerName="registry-server" Oct 03 00:23:13 crc kubenswrapper[4832]: I1003 00:23:13.398549 4832 state_mem.go:107] "Deleted CPUSet assignment" podUID="c40eb97c-936a-4d46-9016-3a062386ae3c" containerName="registry-server" Oct 03 00:23:13 crc kubenswrapper[4832]: E1003 00:23:13.398559 4832 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c40eb97c-936a-4d46-9016-3a062386ae3c" containerName="extract-utilities" Oct 03 00:23:13 crc kubenswrapper[4832]: I1003 00:23:13.398565 4832 state_mem.go:107] "Deleted CPUSet assignment" podUID="c40eb97c-936a-4d46-9016-3a062386ae3c" containerName="extract-utilities" Oct 03 00:23:13 crc kubenswrapper[4832]: E1003 00:23:13.398576 4832 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c40eb97c-936a-4d46-9016-3a062386ae3c" containerName="extract-content" Oct 03 00:23:13 crc kubenswrapper[4832]: I1003 00:23:13.398583 4832 state_mem.go:107] "Deleted CPUSet assignment" podUID="c40eb97c-936a-4d46-9016-3a062386ae3c" containerName="extract-content" Oct 03 00:23:13 crc kubenswrapper[4832]: I1003 00:23:13.398678 4832 memory_manager.go:354] "RemoveStaleState removing state" podUID="c40eb97c-936a-4d46-9016-3a062386ae3c" containerName="registry-server" Oct 03 00:23:13 crc kubenswrapper[4832]: I1003 00:23:13.399401 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2djgm5j" Oct 03 00:23:13 crc kubenswrapper[4832]: I1003 00:23:13.401369 4832 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 03 00:23:13 crc kubenswrapper[4832]: I1003 00:23:13.406426 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2djgm5j"] Oct 03 00:23:13 crc kubenswrapper[4832]: I1003 00:23:13.437318 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qsbqt\" (UniqueName: \"kubernetes.io/projected/70a18e90-8200-46bb-b7e4-cde7e0ff2742-kube-api-access-qsbqt\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2djgm5j\" (UID: \"70a18e90-8200-46bb-b7e4-cde7e0ff2742\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2djgm5j" Oct 03 00:23:13 crc kubenswrapper[4832]: I1003 00:23:13.437537 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/70a18e90-8200-46bb-b7e4-cde7e0ff2742-util\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2djgm5j\" (UID: \"70a18e90-8200-46bb-b7e4-cde7e0ff2742\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2djgm5j" Oct 03 00:23:13 crc kubenswrapper[4832]: I1003 00:23:13.437649 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/70a18e90-8200-46bb-b7e4-cde7e0ff2742-bundle\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2djgm5j\" (UID: \"70a18e90-8200-46bb-b7e4-cde7e0ff2742\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2djgm5j" Oct 03 00:23:13 crc kubenswrapper[4832]: I1003 00:23:13.539018 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/70a18e90-8200-46bb-b7e4-cde7e0ff2742-bundle\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2djgm5j\" (UID: \"70a18e90-8200-46bb-b7e4-cde7e0ff2742\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2djgm5j" Oct 03 00:23:13 crc kubenswrapper[4832]: I1003 00:23:13.539135 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qsbqt\" (UniqueName: \"kubernetes.io/projected/70a18e90-8200-46bb-b7e4-cde7e0ff2742-kube-api-access-qsbqt\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2djgm5j\" (UID: \"70a18e90-8200-46bb-b7e4-cde7e0ff2742\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2djgm5j" Oct 03 00:23:13 crc kubenswrapper[4832]: I1003 00:23:13.539182 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/70a18e90-8200-46bb-b7e4-cde7e0ff2742-util\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2djgm5j\" (UID: \"70a18e90-8200-46bb-b7e4-cde7e0ff2742\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2djgm5j" Oct 03 00:23:13 crc kubenswrapper[4832]: I1003 00:23:13.539947 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/70a18e90-8200-46bb-b7e4-cde7e0ff2742-util\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2djgm5j\" (UID: \"70a18e90-8200-46bb-b7e4-cde7e0ff2742\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2djgm5j" Oct 03 00:23:13 crc kubenswrapper[4832]: I1003 00:23:13.540136 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/70a18e90-8200-46bb-b7e4-cde7e0ff2742-bundle\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2djgm5j\" (UID: \"70a18e90-8200-46bb-b7e4-cde7e0ff2742\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2djgm5j" Oct 03 00:23:13 crc kubenswrapper[4832]: I1003 00:23:13.557789 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qsbqt\" (UniqueName: \"kubernetes.io/projected/70a18e90-8200-46bb-b7e4-cde7e0ff2742-kube-api-access-qsbqt\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2djgm5j\" (UID: \"70a18e90-8200-46bb-b7e4-cde7e0ff2742\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2djgm5j" Oct 03 00:23:13 crc kubenswrapper[4832]: I1003 00:23:13.717828 4832 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2djgm5j" Oct 03 00:23:13 crc kubenswrapper[4832]: I1003 00:23:13.934293 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2djgm5j"] Oct 03 00:23:14 crc kubenswrapper[4832]: I1003 00:23:14.209026 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2djgm5j" event={"ID":"70a18e90-8200-46bb-b7e4-cde7e0ff2742","Type":"ContainerStarted","Data":"0269a20dd8417873168c4a15f05fcf586d28a5ab5b4e67d2f553332d673fcc3d"} Oct 03 00:23:14 crc kubenswrapper[4832]: I1003 00:23:14.209277 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2djgm5j" event={"ID":"70a18e90-8200-46bb-b7e4-cde7e0ff2742","Type":"ContainerStarted","Data":"e63c5322de4efd53bcce60c8db12da84ce888486804b46f2a902e0f571b000f6"} Oct 03 00:23:15 crc kubenswrapper[4832]: I1003 00:23:15.214569 4832 generic.go:334] "Generic (PLEG): container finished" podID="70a18e90-8200-46bb-b7e4-cde7e0ff2742" containerID="0269a20dd8417873168c4a15f05fcf586d28a5ab5b4e67d2f553332d673fcc3d" exitCode=0 Oct 03 00:23:15 crc kubenswrapper[4832]: I1003 00:23:15.214635 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2djgm5j" event={"ID":"70a18e90-8200-46bb-b7e4-cde7e0ff2742","Type":"ContainerDied","Data":"0269a20dd8417873168c4a15f05fcf586d28a5ab5b4e67d2f553332d673fcc3d"} Oct 03 00:23:17 crc kubenswrapper[4832]: I1003 00:23:17.227551 4832 generic.go:334] "Generic (PLEG): container finished" podID="70a18e90-8200-46bb-b7e4-cde7e0ff2742" containerID="3583cb49b2ded66d5314f5f3b5969a106de2e03928eb29d7b8e6fb4c6d2d6d24" exitCode=0 Oct 03 00:23:17 crc kubenswrapper[4832]: I1003 00:23:17.227667 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2djgm5j" event={"ID":"70a18e90-8200-46bb-b7e4-cde7e0ff2742","Type":"ContainerDied","Data":"3583cb49b2ded66d5314f5f3b5969a106de2e03928eb29d7b8e6fb4c6d2d6d24"} Oct 03 00:23:18 crc kubenswrapper[4832]: I1003 00:23:18.233812 4832 generic.go:334] "Generic (PLEG): container finished" podID="70a18e90-8200-46bb-b7e4-cde7e0ff2742" containerID="ce459dda5c8eaa8288a9ec8f2c70283987703d3956346cf8f6ed0891646df1a1" exitCode=0 Oct 03 00:23:18 crc kubenswrapper[4832]: I1003 00:23:18.233899 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2djgm5j" event={"ID":"70a18e90-8200-46bb-b7e4-cde7e0ff2742","Type":"ContainerDied","Data":"ce459dda5c8eaa8288a9ec8f2c70283987703d3956346cf8f6ed0891646df1a1"} Oct 03 00:23:19 crc kubenswrapper[4832]: I1003 00:23:19.429594 4832 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2djgm5j" Oct 03 00:23:19 crc kubenswrapper[4832]: I1003 00:23:19.551005 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/70a18e90-8200-46bb-b7e4-cde7e0ff2742-util\") pod \"70a18e90-8200-46bb-b7e4-cde7e0ff2742\" (UID: \"70a18e90-8200-46bb-b7e4-cde7e0ff2742\") " Oct 03 00:23:19 crc kubenswrapper[4832]: I1003 00:23:19.551050 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qsbqt\" (UniqueName: \"kubernetes.io/projected/70a18e90-8200-46bb-b7e4-cde7e0ff2742-kube-api-access-qsbqt\") pod \"70a18e90-8200-46bb-b7e4-cde7e0ff2742\" (UID: \"70a18e90-8200-46bb-b7e4-cde7e0ff2742\") " Oct 03 00:23:19 crc kubenswrapper[4832]: I1003 00:23:19.551075 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/70a18e90-8200-46bb-b7e4-cde7e0ff2742-bundle\") pod \"70a18e90-8200-46bb-b7e4-cde7e0ff2742\" (UID: \"70a18e90-8200-46bb-b7e4-cde7e0ff2742\") " Oct 03 00:23:19 crc kubenswrapper[4832]: I1003 00:23:19.553221 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/70a18e90-8200-46bb-b7e4-cde7e0ff2742-bundle" (OuterVolumeSpecName: "bundle") pod "70a18e90-8200-46bb-b7e4-cde7e0ff2742" (UID: "70a18e90-8200-46bb-b7e4-cde7e0ff2742"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 00:23:19 crc kubenswrapper[4832]: I1003 00:23:19.557889 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70a18e90-8200-46bb-b7e4-cde7e0ff2742-kube-api-access-qsbqt" (OuterVolumeSpecName: "kube-api-access-qsbqt") pod "70a18e90-8200-46bb-b7e4-cde7e0ff2742" (UID: "70a18e90-8200-46bb-b7e4-cde7e0ff2742"). InnerVolumeSpecName "kube-api-access-qsbqt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 00:23:19 crc kubenswrapper[4832]: I1003 00:23:19.652054 4832 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qsbqt\" (UniqueName: \"kubernetes.io/projected/70a18e90-8200-46bb-b7e4-cde7e0ff2742-kube-api-access-qsbqt\") on node \"crc\" DevicePath \"\"" Oct 03 00:23:19 crc kubenswrapper[4832]: I1003 00:23:19.652097 4832 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/70a18e90-8200-46bb-b7e4-cde7e0ff2742-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 00:23:19 crc kubenswrapper[4832]: I1003 00:23:19.806031 4832 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fx77t9"] Oct 03 00:23:19 crc kubenswrapper[4832]: E1003 00:23:19.806245 4832 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70a18e90-8200-46bb-b7e4-cde7e0ff2742" containerName="util" Oct 03 00:23:19 crc kubenswrapper[4832]: I1003 00:23:19.806256 4832 state_mem.go:107] "Deleted CPUSet assignment" podUID="70a18e90-8200-46bb-b7e4-cde7e0ff2742" containerName="util" Oct 03 00:23:19 crc kubenswrapper[4832]: E1003 00:23:19.806264 4832 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70a18e90-8200-46bb-b7e4-cde7e0ff2742" containerName="pull" Oct 03 00:23:19 crc kubenswrapper[4832]: I1003 00:23:19.806269 4832 state_mem.go:107] "Deleted CPUSet assignment" podUID="70a18e90-8200-46bb-b7e4-cde7e0ff2742" containerName="pull" Oct 03 00:23:19 crc kubenswrapper[4832]: E1003 00:23:19.806281 4832 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70a18e90-8200-46bb-b7e4-cde7e0ff2742" containerName="extract" Oct 03 00:23:19 crc kubenswrapper[4832]: I1003 00:23:19.806287 4832 state_mem.go:107] "Deleted CPUSet assignment" podUID="70a18e90-8200-46bb-b7e4-cde7e0ff2742" containerName="extract" Oct 03 00:23:19 crc kubenswrapper[4832]: I1003 00:23:19.806392 4832 memory_manager.go:354] "RemoveStaleState removing state" podUID="70a18e90-8200-46bb-b7e4-cde7e0ff2742" containerName="extract" Oct 03 00:23:19 crc kubenswrapper[4832]: I1003 00:23:19.807067 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fx77t9" Oct 03 00:23:19 crc kubenswrapper[4832]: I1003 00:23:19.812510 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fx77t9"] Oct 03 00:23:19 crc kubenswrapper[4832]: I1003 00:23:19.838702 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/70a18e90-8200-46bb-b7e4-cde7e0ff2742-util" (OuterVolumeSpecName: "util") pod "70a18e90-8200-46bb-b7e4-cde7e0ff2742" (UID: "70a18e90-8200-46bb-b7e4-cde7e0ff2742"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 00:23:19 crc kubenswrapper[4832]: I1003 00:23:19.854248 4832 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/70a18e90-8200-46bb-b7e4-cde7e0ff2742-util\") on node \"crc\" DevicePath \"\"" Oct 03 00:23:19 crc kubenswrapper[4832]: I1003 00:23:19.955060 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d1d45f3c-b296-4f03-99c2-dfa01faf7785-util\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fx77t9\" (UID: \"d1d45f3c-b296-4f03-99c2-dfa01faf7785\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fx77t9" Oct 03 00:23:19 crc kubenswrapper[4832]: I1003 00:23:19.955103 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tshfr\" (UniqueName: \"kubernetes.io/projected/d1d45f3c-b296-4f03-99c2-dfa01faf7785-kube-api-access-tshfr\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fx77t9\" (UID: \"d1d45f3c-b296-4f03-99c2-dfa01faf7785\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fx77t9" Oct 03 00:23:19 crc kubenswrapper[4832]: I1003 00:23:19.955144 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d1d45f3c-b296-4f03-99c2-dfa01faf7785-bundle\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fx77t9\" (UID: \"d1d45f3c-b296-4f03-99c2-dfa01faf7785\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fx77t9" Oct 03 00:23:20 crc kubenswrapper[4832]: I1003 00:23:20.056374 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d1d45f3c-b296-4f03-99c2-dfa01faf7785-util\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fx77t9\" (UID: \"d1d45f3c-b296-4f03-99c2-dfa01faf7785\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fx77t9" Oct 03 00:23:20 crc kubenswrapper[4832]: I1003 00:23:20.056426 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tshfr\" (UniqueName: \"kubernetes.io/projected/d1d45f3c-b296-4f03-99c2-dfa01faf7785-kube-api-access-tshfr\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fx77t9\" (UID: \"d1d45f3c-b296-4f03-99c2-dfa01faf7785\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fx77t9" Oct 03 00:23:20 crc kubenswrapper[4832]: I1003 00:23:20.056467 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d1d45f3c-b296-4f03-99c2-dfa01faf7785-bundle\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fx77t9\" (UID: \"d1d45f3c-b296-4f03-99c2-dfa01faf7785\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fx77t9" Oct 03 00:23:20 crc kubenswrapper[4832]: I1003 00:23:20.056877 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d1d45f3c-b296-4f03-99c2-dfa01faf7785-util\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fx77t9\" (UID: \"d1d45f3c-b296-4f03-99c2-dfa01faf7785\") " 
pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fx77t9" Oct 03 00:23:20 crc kubenswrapper[4832]: I1003 00:23:20.056904 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d1d45f3c-b296-4f03-99c2-dfa01faf7785-bundle\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fx77t9\" (UID: \"d1d45f3c-b296-4f03-99c2-dfa01faf7785\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fx77t9" Oct 03 00:23:20 crc kubenswrapper[4832]: I1003 00:23:20.071371 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tshfr\" (UniqueName: \"kubernetes.io/projected/d1d45f3c-b296-4f03-99c2-dfa01faf7785-kube-api-access-tshfr\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fx77t9\" (UID: \"d1d45f3c-b296-4f03-99c2-dfa01faf7785\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fx77t9" Oct 03 00:23:20 crc kubenswrapper[4832]: I1003 00:23:20.123307 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fx77t9" Oct 03 00:23:20 crc kubenswrapper[4832]: I1003 00:23:20.245367 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2djgm5j" event={"ID":"70a18e90-8200-46bb-b7e4-cde7e0ff2742","Type":"ContainerDied","Data":"e63c5322de4efd53bcce60c8db12da84ce888486804b46f2a902e0f571b000f6"} Oct 03 00:23:20 crc kubenswrapper[4832]: I1003 00:23:20.245400 4832 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e63c5322de4efd53bcce60c8db12da84ce888486804b46f2a902e0f571b000f6" Oct 03 00:23:20 crc kubenswrapper[4832]: I1003 00:23:20.245433 4832 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2djgm5j" Oct 03 00:23:20 crc kubenswrapper[4832]: I1003 00:23:20.273375 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fx77t9"] Oct 03 00:23:20 crc kubenswrapper[4832]: I1003 00:23:20.591800 4832 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/dec6855936450d9745a6250315030408d50c8ea57f37cefa88be37c927nsvjt"] Oct 03 00:23:20 crc kubenswrapper[4832]: I1003 00:23:20.592890 4832 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/dec6855936450d9745a6250315030408d50c8ea57f37cefa88be37c927nsvjt" Oct 03 00:23:20 crc kubenswrapper[4832]: I1003 00:23:20.600936 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/dec6855936450d9745a6250315030408d50c8ea57f37cefa88be37c927nsvjt"] Oct 03 00:23:20 crc kubenswrapper[4832]: I1003 00:23:20.665656 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d031b5fc-f2b1-4d17-bbaf-bdee585e907d-util\") pod \"dec6855936450d9745a6250315030408d50c8ea57f37cefa88be37c927nsvjt\" (UID: \"d031b5fc-f2b1-4d17-bbaf-bdee585e907d\") " pod="openshift-marketplace/dec6855936450d9745a6250315030408d50c8ea57f37cefa88be37c927nsvjt" Oct 03 00:23:20 crc kubenswrapper[4832]: I1003 00:23:20.665722 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d031b5fc-f2b1-4d17-bbaf-bdee585e907d-bundle\") pod \"dec6855936450d9745a6250315030408d50c8ea57f37cefa88be37c927nsvjt\" (UID: \"d031b5fc-f2b1-4d17-bbaf-bdee585e907d\") " pod="openshift-marketplace/dec6855936450d9745a6250315030408d50c8ea57f37cefa88be37c927nsvjt" Oct 03 00:23:20 crc kubenswrapper[4832]: I1003 00:23:20.665766 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chn7w\" (UniqueName: \"kubernetes.io/projected/d031b5fc-f2b1-4d17-bbaf-bdee585e907d-kube-api-access-chn7w\") pod \"dec6855936450d9745a6250315030408d50c8ea57f37cefa88be37c927nsvjt\" (UID: \"d031b5fc-f2b1-4d17-bbaf-bdee585e907d\") " pod="openshift-marketplace/dec6855936450d9745a6250315030408d50c8ea57f37cefa88be37c927nsvjt" Oct 03 00:23:20 crc kubenswrapper[4832]: I1003 00:23:20.766779 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d031b5fc-f2b1-4d17-bbaf-bdee585e907d-util\") pod \"dec6855936450d9745a6250315030408d50c8ea57f37cefa88be37c927nsvjt\" (UID: \"d031b5fc-f2b1-4d17-bbaf-bdee585e907d\") " pod="openshift-marketplace/dec6855936450d9745a6250315030408d50c8ea57f37cefa88be37c927nsvjt" Oct 03 00:23:20 crc kubenswrapper[4832]: I1003 00:23:20.766847 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d031b5fc-f2b1-4d17-bbaf-bdee585e907d-bundle\") pod \"dec6855936450d9745a6250315030408d50c8ea57f37cefa88be37c927nsvjt\" (UID: \"d031b5fc-f2b1-4d17-bbaf-bdee585e907d\") " pod="openshift-marketplace/dec6855936450d9745a6250315030408d50c8ea57f37cefa88be37c927nsvjt" Oct 03 00:23:20 crc kubenswrapper[4832]: I1003 00:23:20.766874 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-chn7w\" (UniqueName: \"kubernetes.io/projected/d031b5fc-f2b1-4d17-bbaf-bdee585e907d-kube-api-access-chn7w\") pod \"dec6855936450d9745a6250315030408d50c8ea57f37cefa88be37c927nsvjt\" (UID: \"d031b5fc-f2b1-4d17-bbaf-bdee585e907d\") " pod="openshift-marketplace/dec6855936450d9745a6250315030408d50c8ea57f37cefa88be37c927nsvjt" Oct 03 00:23:20 crc kubenswrapper[4832]: I1003 00:23:20.767373 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d031b5fc-f2b1-4d17-bbaf-bdee585e907d-util\") pod \"dec6855936450d9745a6250315030408d50c8ea57f37cefa88be37c927nsvjt\" (UID: \"d031b5fc-f2b1-4d17-bbaf-bdee585e907d\") " 
pod="openshift-marketplace/dec6855936450d9745a6250315030408d50c8ea57f37cefa88be37c927nsvjt" Oct 03 00:23:20 crc kubenswrapper[4832]: I1003 00:23:20.767310 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d031b5fc-f2b1-4d17-bbaf-bdee585e907d-bundle\") pod \"dec6855936450d9745a6250315030408d50c8ea57f37cefa88be37c927nsvjt\" (UID: \"d031b5fc-f2b1-4d17-bbaf-bdee585e907d\") " pod="openshift-marketplace/dec6855936450d9745a6250315030408d50c8ea57f37cefa88be37c927nsvjt" Oct 03 00:23:20 crc kubenswrapper[4832]: I1003 00:23:20.783426 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-chn7w\" (UniqueName: \"kubernetes.io/projected/d031b5fc-f2b1-4d17-bbaf-bdee585e907d-kube-api-access-chn7w\") pod \"dec6855936450d9745a6250315030408d50c8ea57f37cefa88be37c927nsvjt\" (UID: \"d031b5fc-f2b1-4d17-bbaf-bdee585e907d\") " pod="openshift-marketplace/dec6855936450d9745a6250315030408d50c8ea57f37cefa88be37c927nsvjt" Oct 03 00:23:20 crc kubenswrapper[4832]: I1003 00:23:20.905902 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/dec6855936450d9745a6250315030408d50c8ea57f37cefa88be37c927nsvjt" Oct 03 00:23:21 crc kubenswrapper[4832]: I1003 00:23:21.068428 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/dec6855936450d9745a6250315030408d50c8ea57f37cefa88be37c927nsvjt"] Oct 03 00:23:21 crc kubenswrapper[4832]: W1003 00:23:21.072606 4832 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd031b5fc_f2b1_4d17_bbaf_bdee585e907d.slice/crio-c88ccd3a258bd805f8c35276b4e0ea9aedba0726a4d210a3357b8a8d0a9a59e3 WatchSource:0}: Error finding container c88ccd3a258bd805f8c35276b4e0ea9aedba0726a4d210a3357b8a8d0a9a59e3: Status 404 returned error can't find the container with id c88ccd3a258bd805f8c35276b4e0ea9aedba0726a4d210a3357b8a8d0a9a59e3 Oct 03 00:23:21 crc kubenswrapper[4832]: I1003 00:23:21.251850 4832 generic.go:334] "Generic (PLEG): container finished" podID="d031b5fc-f2b1-4d17-bbaf-bdee585e907d" containerID="860a4b5e5b9cdc9709dec793f22e769a2c832a74d813e6fda53c2cc6e5ccafaa" exitCode=0 Oct 03 00:23:21 crc kubenswrapper[4832]: I1003 00:23:21.251947 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/dec6855936450d9745a6250315030408d50c8ea57f37cefa88be37c927nsvjt" event={"ID":"d031b5fc-f2b1-4d17-bbaf-bdee585e907d","Type":"ContainerDied","Data":"860a4b5e5b9cdc9709dec793f22e769a2c832a74d813e6fda53c2cc6e5ccafaa"} Oct 03 00:23:21 crc kubenswrapper[4832]: I1003 00:23:21.251980 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/dec6855936450d9745a6250315030408d50c8ea57f37cefa88be37c927nsvjt" event={"ID":"d031b5fc-f2b1-4d17-bbaf-bdee585e907d","Type":"ContainerStarted","Data":"c88ccd3a258bd805f8c35276b4e0ea9aedba0726a4d210a3357b8a8d0a9a59e3"} Oct 03 00:23:21 crc kubenswrapper[4832]: I1003 00:23:21.254279 4832 generic.go:334] "Generic (PLEG): container finished" podID="d1d45f3c-b296-4f03-99c2-dfa01faf7785" containerID="8e45cfdc84885def613d4cbaaac3839a4b2183636cc8371c263311f89b8c10b8" exitCode=0 Oct 03 00:23:21 crc kubenswrapper[4832]: I1003 00:23:21.254306 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fx77t9" 
event={"ID":"d1d45f3c-b296-4f03-99c2-dfa01faf7785","Type":"ContainerDied","Data":"8e45cfdc84885def613d4cbaaac3839a4b2183636cc8371c263311f89b8c10b8"} Oct 03 00:23:21 crc kubenswrapper[4832]: I1003 00:23:21.254322 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fx77t9" event={"ID":"d1d45f3c-b296-4f03-99c2-dfa01faf7785","Type":"ContainerStarted","Data":"3b73fd21587c160c1e9fef228c09a8e6fdb1de6baa5c8885a7869d5095983742"} Oct 03 00:23:23 crc kubenswrapper[4832]: I1003 00:23:23.263113 4832 generic.go:334] "Generic (PLEG): container finished" podID="d031b5fc-f2b1-4d17-bbaf-bdee585e907d" containerID="8ad03ad3a0081c518f4d835f1cd37a8620167edf46cff60daa3754c875f93650" exitCode=0 Oct 03 00:23:23 crc kubenswrapper[4832]: I1003 00:23:23.263219 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/dec6855936450d9745a6250315030408d50c8ea57f37cefa88be37c927nsvjt" event={"ID":"d031b5fc-f2b1-4d17-bbaf-bdee585e907d","Type":"ContainerDied","Data":"8ad03ad3a0081c518f4d835f1cd37a8620167edf46cff60daa3754c875f93650"} Oct 03 00:23:23 crc kubenswrapper[4832]: I1003 00:23:23.266084 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fx77t9" event={"ID":"d1d45f3c-b296-4f03-99c2-dfa01faf7785","Type":"ContainerStarted","Data":"a9e1da0ce5db6c32840092c6849a1c57dcd9224126ebd10b6d6bd5e4f480b355"} Oct 03 00:23:24 crc kubenswrapper[4832]: I1003 00:23:24.272643 4832 generic.go:334] "Generic (PLEG): container finished" podID="d031b5fc-f2b1-4d17-bbaf-bdee585e907d" containerID="cb43c92ef7c253d59de773b57f8d15ede64ec2ed2ebdcc58b6ca0b92ff20c09a" exitCode=0 Oct 03 00:23:24 crc kubenswrapper[4832]: I1003 00:23:24.272830 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/dec6855936450d9745a6250315030408d50c8ea57f37cefa88be37c927nsvjt" event={"ID":"d031b5fc-f2b1-4d17-bbaf-bdee585e907d","Type":"ContainerDied","Data":"cb43c92ef7c253d59de773b57f8d15ede64ec2ed2ebdcc58b6ca0b92ff20c09a"} Oct 03 00:23:24 crc kubenswrapper[4832]: I1003 00:23:24.275101 4832 generic.go:334] "Generic (PLEG): container finished" podID="d1d45f3c-b296-4f03-99c2-dfa01faf7785" containerID="a9e1da0ce5db6c32840092c6849a1c57dcd9224126ebd10b6d6bd5e4f480b355" exitCode=0 Oct 03 00:23:24 crc kubenswrapper[4832]: I1003 00:23:24.275144 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fx77t9" event={"ID":"d1d45f3c-b296-4f03-99c2-dfa01faf7785","Type":"ContainerDied","Data":"a9e1da0ce5db6c32840092c6849a1c57dcd9224126ebd10b6d6bd5e4f480b355"} Oct 03 00:23:25 crc kubenswrapper[4832]: I1003 00:23:25.281931 4832 generic.go:334] "Generic (PLEG): container finished" podID="d1d45f3c-b296-4f03-99c2-dfa01faf7785" containerID="9850ef2b0f5667ffc51e6f7fc8619ca808e7d704f8b306ed5d311bc4c5581f50" exitCode=0 Oct 03 00:23:25 crc kubenswrapper[4832]: I1003 00:23:25.281995 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fx77t9" event={"ID":"d1d45f3c-b296-4f03-99c2-dfa01faf7785","Type":"ContainerDied","Data":"9850ef2b0f5667ffc51e6f7fc8619ca808e7d704f8b306ed5d311bc4c5581f50"} Oct 03 00:23:25 crc kubenswrapper[4832]: I1003 00:23:25.615648 4832 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/dec6855936450d9745a6250315030408d50c8ea57f37cefa88be37c927nsvjt" Oct 03 00:23:25 crc kubenswrapper[4832]: I1003 00:23:25.726825 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d031b5fc-f2b1-4d17-bbaf-bdee585e907d-bundle\") pod \"d031b5fc-f2b1-4d17-bbaf-bdee585e907d\" (UID: \"d031b5fc-f2b1-4d17-bbaf-bdee585e907d\") " Oct 03 00:23:25 crc kubenswrapper[4832]: I1003 00:23:25.726916 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-chn7w\" (UniqueName: \"kubernetes.io/projected/d031b5fc-f2b1-4d17-bbaf-bdee585e907d-kube-api-access-chn7w\") pod \"d031b5fc-f2b1-4d17-bbaf-bdee585e907d\" (UID: \"d031b5fc-f2b1-4d17-bbaf-bdee585e907d\") " Oct 03 00:23:25 crc kubenswrapper[4832]: I1003 00:23:25.726971 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d031b5fc-f2b1-4d17-bbaf-bdee585e907d-util\") pod \"d031b5fc-f2b1-4d17-bbaf-bdee585e907d\" (UID: \"d031b5fc-f2b1-4d17-bbaf-bdee585e907d\") " Oct 03 00:23:25 crc kubenswrapper[4832]: I1003 00:23:25.733449 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d031b5fc-f2b1-4d17-bbaf-bdee585e907d-bundle" (OuterVolumeSpecName: "bundle") pod "d031b5fc-f2b1-4d17-bbaf-bdee585e907d" (UID: "d031b5fc-f2b1-4d17-bbaf-bdee585e907d"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 00:23:25 crc kubenswrapper[4832]: I1003 00:23:25.739825 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d031b5fc-f2b1-4d17-bbaf-bdee585e907d-kube-api-access-chn7w" (OuterVolumeSpecName: "kube-api-access-chn7w") pod "d031b5fc-f2b1-4d17-bbaf-bdee585e907d" (UID: "d031b5fc-f2b1-4d17-bbaf-bdee585e907d"). InnerVolumeSpecName "kube-api-access-chn7w". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 00:23:25 crc kubenswrapper[4832]: I1003 00:23:25.744947 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d031b5fc-f2b1-4d17-bbaf-bdee585e907d-util" (OuterVolumeSpecName: "util") pod "d031b5fc-f2b1-4d17-bbaf-bdee585e907d" (UID: "d031b5fc-f2b1-4d17-bbaf-bdee585e907d"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 00:23:25 crc kubenswrapper[4832]: I1003 00:23:25.828737 4832 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d031b5fc-f2b1-4d17-bbaf-bdee585e907d-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 00:23:25 crc kubenswrapper[4832]: I1003 00:23:25.828795 4832 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-chn7w\" (UniqueName: \"kubernetes.io/projected/d031b5fc-f2b1-4d17-bbaf-bdee585e907d-kube-api-access-chn7w\") on node \"crc\" DevicePath \"\"" Oct 03 00:23:25 crc kubenswrapper[4832]: I1003 00:23:25.828807 4832 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d031b5fc-f2b1-4d17-bbaf-bdee585e907d-util\") on node \"crc\" DevicePath \"\"" Oct 03 00:23:26 crc kubenswrapper[4832]: I1003 00:23:26.288897 4832 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/dec6855936450d9745a6250315030408d50c8ea57f37cefa88be37c927nsvjt" Oct 03 00:23:26 crc kubenswrapper[4832]: I1003 00:23:26.293985 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/dec6855936450d9745a6250315030408d50c8ea57f37cefa88be37c927nsvjt" event={"ID":"d031b5fc-f2b1-4d17-bbaf-bdee585e907d","Type":"ContainerDied","Data":"c88ccd3a258bd805f8c35276b4e0ea9aedba0726a4d210a3357b8a8d0a9a59e3"} Oct 03 00:23:26 crc kubenswrapper[4832]: I1003 00:23:26.294038 4832 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c88ccd3a258bd805f8c35276b4e0ea9aedba0726a4d210a3357b8a8d0a9a59e3" Oct 03 00:23:26 crc kubenswrapper[4832]: I1003 00:23:26.585841 4832 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fx77t9" Oct 03 00:23:26 crc kubenswrapper[4832]: I1003 00:23:26.740818 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tshfr\" (UniqueName: \"kubernetes.io/projected/d1d45f3c-b296-4f03-99c2-dfa01faf7785-kube-api-access-tshfr\") pod \"d1d45f3c-b296-4f03-99c2-dfa01faf7785\" (UID: \"d1d45f3c-b296-4f03-99c2-dfa01faf7785\") " Oct 03 00:23:26 crc kubenswrapper[4832]: I1003 00:23:26.740943 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d1d45f3c-b296-4f03-99c2-dfa01faf7785-util\") pod \"d1d45f3c-b296-4f03-99c2-dfa01faf7785\" (UID: \"d1d45f3c-b296-4f03-99c2-dfa01faf7785\") " Oct 03 00:23:26 crc kubenswrapper[4832]: I1003 00:23:26.740993 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d1d45f3c-b296-4f03-99c2-dfa01faf7785-bundle\") pod \"d1d45f3c-b296-4f03-99c2-dfa01faf7785\" (UID: \"d1d45f3c-b296-4f03-99c2-dfa01faf7785\") " Oct 03 00:23:26 crc kubenswrapper[4832]: I1003 00:23:26.741575 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d1d45f3c-b296-4f03-99c2-dfa01faf7785-bundle" (OuterVolumeSpecName: "bundle") pod "d1d45f3c-b296-4f03-99c2-dfa01faf7785" (UID: "d1d45f3c-b296-4f03-99c2-dfa01faf7785"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 00:23:26 crc kubenswrapper[4832]: I1003 00:23:26.744746 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1d45f3c-b296-4f03-99c2-dfa01faf7785-kube-api-access-tshfr" (OuterVolumeSpecName: "kube-api-access-tshfr") pod "d1d45f3c-b296-4f03-99c2-dfa01faf7785" (UID: "d1d45f3c-b296-4f03-99c2-dfa01faf7785"). InnerVolumeSpecName "kube-api-access-tshfr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 00:23:26 crc kubenswrapper[4832]: I1003 00:23:26.755023 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d1d45f3c-b296-4f03-99c2-dfa01faf7785-util" (OuterVolumeSpecName: "util") pod "d1d45f3c-b296-4f03-99c2-dfa01faf7785" (UID: "d1d45f3c-b296-4f03-99c2-dfa01faf7785"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 00:23:26 crc kubenswrapper[4832]: I1003 00:23:26.842363 4832 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d1d45f3c-b296-4f03-99c2-dfa01faf7785-util\") on node \"crc\" DevicePath \"\"" Oct 03 00:23:26 crc kubenswrapper[4832]: I1003 00:23:26.842403 4832 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d1d45f3c-b296-4f03-99c2-dfa01faf7785-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 00:23:26 crc kubenswrapper[4832]: I1003 00:23:26.842415 4832 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tshfr\" (UniqueName: \"kubernetes.io/projected/d1d45f3c-b296-4f03-99c2-dfa01faf7785-kube-api-access-tshfr\") on node \"crc\" DevicePath \"\"" Oct 03 00:23:27 crc kubenswrapper[4832]: I1003 00:23:27.308983 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fx77t9" event={"ID":"d1d45f3c-b296-4f03-99c2-dfa01faf7785","Type":"ContainerDied","Data":"3b73fd21587c160c1e9fef228c09a8e6fdb1de6baa5c8885a7869d5095983742"} Oct 03 00:23:27 crc kubenswrapper[4832]: I1003 00:23:27.309036 4832 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3b73fd21587c160c1e9fef228c09a8e6fdb1de6baa5c8885a7869d5095983742" Oct 03 00:23:27 crc kubenswrapper[4832]: I1003 00:23:27.309055 4832 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fx77t9" Oct 03 00:23:28 crc kubenswrapper[4832]: I1003 00:23:28.554107 4832 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-7c8cf85677-97fx8"] Oct 03 00:23:28 crc kubenswrapper[4832]: E1003 00:23:28.555255 4832 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1d45f3c-b296-4f03-99c2-dfa01faf7785" containerName="pull" Oct 03 00:23:28 crc kubenswrapper[4832]: I1003 00:23:28.555335 4832 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1d45f3c-b296-4f03-99c2-dfa01faf7785" containerName="pull" Oct 03 00:23:28 crc kubenswrapper[4832]: E1003 00:23:28.555432 4832 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1d45f3c-b296-4f03-99c2-dfa01faf7785" containerName="util" Oct 03 00:23:28 crc kubenswrapper[4832]: I1003 00:23:28.555504 4832 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1d45f3c-b296-4f03-99c2-dfa01faf7785" containerName="util" Oct 03 00:23:28 crc kubenswrapper[4832]: E1003 00:23:28.555665 4832 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d031b5fc-f2b1-4d17-bbaf-bdee585e907d" containerName="extract" Oct 03 00:23:28 crc kubenswrapper[4832]: I1003 00:23:28.555732 4832 state_mem.go:107] "Deleted CPUSet assignment" podUID="d031b5fc-f2b1-4d17-bbaf-bdee585e907d" containerName="extract" Oct 03 00:23:28 crc kubenswrapper[4832]: E1003 00:23:28.555812 4832 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d031b5fc-f2b1-4d17-bbaf-bdee585e907d" containerName="pull" Oct 03 00:23:28 crc kubenswrapper[4832]: I1003 00:23:28.555872 4832 state_mem.go:107] "Deleted CPUSet assignment" podUID="d031b5fc-f2b1-4d17-bbaf-bdee585e907d" containerName="pull" Oct 03 00:23:28 crc kubenswrapper[4832]: E1003 00:23:28.555959 4832 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1d45f3c-b296-4f03-99c2-dfa01faf7785" containerName="extract" Oct 03 00:23:28 
crc kubenswrapper[4832]: I1003 00:23:28.556028 4832 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1d45f3c-b296-4f03-99c2-dfa01faf7785" containerName="extract" Oct 03 00:23:28 crc kubenswrapper[4832]: E1003 00:23:28.556092 4832 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d031b5fc-f2b1-4d17-bbaf-bdee585e907d" containerName="util" Oct 03 00:23:28 crc kubenswrapper[4832]: I1003 00:23:28.556153 4832 state_mem.go:107] "Deleted CPUSet assignment" podUID="d031b5fc-f2b1-4d17-bbaf-bdee585e907d" containerName="util" Oct 03 00:23:28 crc kubenswrapper[4832]: I1003 00:23:28.556310 4832 memory_manager.go:354] "RemoveStaleState removing state" podUID="d031b5fc-f2b1-4d17-bbaf-bdee585e907d" containerName="extract" Oct 03 00:23:28 crc kubenswrapper[4832]: I1003 00:23:28.556542 4832 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1d45f3c-b296-4f03-99c2-dfa01faf7785" containerName="extract" Oct 03 00:23:28 crc kubenswrapper[4832]: I1003 00:23:28.557075 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-97fx8" Oct 03 00:23:28 crc kubenswrapper[4832]: I1003 00:23:28.559719 4832 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-xtzn6" Oct 03 00:23:28 crc kubenswrapper[4832]: I1003 00:23:28.560636 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Oct 03 00:23:28 crc kubenswrapper[4832]: I1003 00:23:28.562765 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Oct 03 00:23:28 crc kubenswrapper[4832]: I1003 00:23:28.564084 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-7c8cf85677-97fx8"] Oct 03 00:23:28 crc kubenswrapper[4832]: I1003 00:23:28.666720 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrt7t\" (UniqueName: \"kubernetes.io/projected/79f70684-a7c1-4ae6-97f5-9cf0d32f472f-kube-api-access-zrt7t\") pod \"obo-prometheus-operator-7c8cf85677-97fx8\" (UID: \"79f70684-a7c1-4ae6-97f5-9cf0d32f472f\") " pod="openshift-operators/obo-prometheus-operator-7c8cf85677-97fx8" Oct 03 00:23:28 crc kubenswrapper[4832]: I1003 00:23:28.673600 4832 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-69f4747886-tbzpl"] Oct 03 00:23:28 crc kubenswrapper[4832]: I1003 00:23:28.674206 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-69f4747886-tbzpl" Oct 03 00:23:28 crc kubenswrapper[4832]: I1003 00:23:28.676540 4832 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Oct 03 00:23:28 crc kubenswrapper[4832]: I1003 00:23:28.679149 4832 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-lcgcp" Oct 03 00:23:28 crc kubenswrapper[4832]: I1003 00:23:28.683231 4832 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-69f4747886-mg6lg"] Oct 03 00:23:28 crc kubenswrapper[4832]: I1003 00:23:28.683881 4832 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-69f4747886-mg6lg" Oct 03 00:23:28 crc kubenswrapper[4832]: I1003 00:23:28.702102 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-69f4747886-mg6lg"] Oct 03 00:23:28 crc kubenswrapper[4832]: I1003 00:23:28.735395 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-69f4747886-tbzpl"] Oct 03 00:23:28 crc kubenswrapper[4832]: I1003 00:23:28.767900 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zrt7t\" (UniqueName: \"kubernetes.io/projected/79f70684-a7c1-4ae6-97f5-9cf0d32f472f-kube-api-access-zrt7t\") pod \"obo-prometheus-operator-7c8cf85677-97fx8\" (UID: \"79f70684-a7c1-4ae6-97f5-9cf0d32f472f\") " pod="openshift-operators/obo-prometheus-operator-7c8cf85677-97fx8" Oct 03 00:23:28 crc kubenswrapper[4832]: I1003 00:23:28.767941 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d0c894cc-8a02-4c23-b822-e4652aa0fde8-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-69f4747886-tbzpl\" (UID: \"d0c894cc-8a02-4c23-b822-e4652aa0fde8\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-69f4747886-tbzpl" Oct 03 00:23:28 crc kubenswrapper[4832]: I1003 00:23:28.767966 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d0c894cc-8a02-4c23-b822-e4652aa0fde8-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-69f4747886-tbzpl\" (UID: \"d0c894cc-8a02-4c23-b822-e4652aa0fde8\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-69f4747886-tbzpl" Oct 03 00:23:28 crc kubenswrapper[4832]: I1003 00:23:28.768002 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0e1c4356-58d2-49a9-aca4-bccb8dac4089-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-69f4747886-mg6lg\" (UID: \"0e1c4356-58d2-49a9-aca4-bccb8dac4089\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-69f4747886-mg6lg" Oct 03 00:23:28 crc kubenswrapper[4832]: I1003 00:23:28.768020 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0e1c4356-58d2-49a9-aca4-bccb8dac4089-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-69f4747886-mg6lg\" (UID: \"0e1c4356-58d2-49a9-aca4-bccb8dac4089\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-69f4747886-mg6lg" Oct 03 00:23:28 crc kubenswrapper[4832]: I1003 00:23:28.800425 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrt7t\" (UniqueName: \"kubernetes.io/projected/79f70684-a7c1-4ae6-97f5-9cf0d32f472f-kube-api-access-zrt7t\") pod \"obo-prometheus-operator-7c8cf85677-97fx8\" (UID: \"79f70684-a7c1-4ae6-97f5-9cf0d32f472f\") " pod="openshift-operators/obo-prometheus-operator-7c8cf85677-97fx8" Oct 03 00:23:28 crc kubenswrapper[4832]: I1003 00:23:28.804199 4832 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-cc5f78dfc-482lq"] Oct 03 00:23:28 crc kubenswrapper[4832]: I1003 00:23:28.804885 4832 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-cc5f78dfc-482lq" Oct 03 00:23:28 crc kubenswrapper[4832]: I1003 00:23:28.806533 4832 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Oct 03 00:23:28 crc kubenswrapper[4832]: I1003 00:23:28.806771 4832 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-d9qm9" Oct 03 00:23:28 crc kubenswrapper[4832]: I1003 00:23:28.859086 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-cc5f78dfc-482lq"] Oct 03 00:23:28 crc kubenswrapper[4832]: I1003 00:23:28.869173 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0e1c4356-58d2-49a9-aca4-bccb8dac4089-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-69f4747886-mg6lg\" (UID: \"0e1c4356-58d2-49a9-aca4-bccb8dac4089\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-69f4747886-mg6lg" Oct 03 00:23:28 crc kubenswrapper[4832]: I1003 00:23:28.869217 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0e1c4356-58d2-49a9-aca4-bccb8dac4089-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-69f4747886-mg6lg\" (UID: \"0e1c4356-58d2-49a9-aca4-bccb8dac4089\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-69f4747886-mg6lg" Oct 03 00:23:28 crc kubenswrapper[4832]: I1003 00:23:28.869288 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d0c894cc-8a02-4c23-b822-e4652aa0fde8-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-69f4747886-tbzpl\" (UID: \"d0c894cc-8a02-4c23-b822-e4652aa0fde8\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-69f4747886-tbzpl" Oct 03 00:23:28 crc kubenswrapper[4832]: I1003 00:23:28.869305 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d0c894cc-8a02-4c23-b822-e4652aa0fde8-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-69f4747886-tbzpl\" (UID: \"d0c894cc-8a02-4c23-b822-e4652aa0fde8\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-69f4747886-tbzpl" Oct 03 00:23:28 crc kubenswrapper[4832]: I1003 00:23:28.872687 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d0c894cc-8a02-4c23-b822-e4652aa0fde8-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-69f4747886-tbzpl\" (UID: \"d0c894cc-8a02-4c23-b822-e4652aa0fde8\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-69f4747886-tbzpl" Oct 03 00:23:28 crc kubenswrapper[4832]: I1003 00:23:28.873384 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d0c894cc-8a02-4c23-b822-e4652aa0fde8-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-69f4747886-tbzpl\" (UID: \"d0c894cc-8a02-4c23-b822-e4652aa0fde8\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-69f4747886-tbzpl" Oct 03 00:23:28 crc kubenswrapper[4832]: I1003 00:23:28.873425 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" 
(UniqueName: \"kubernetes.io/secret/0e1c4356-58d2-49a9-aca4-bccb8dac4089-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-69f4747886-mg6lg\" (UID: \"0e1c4356-58d2-49a9-aca4-bccb8dac4089\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-69f4747886-mg6lg" Oct 03 00:23:28 crc kubenswrapper[4832]: I1003 00:23:28.873399 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0e1c4356-58d2-49a9-aca4-bccb8dac4089-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-69f4747886-mg6lg\" (UID: \"0e1c4356-58d2-49a9-aca4-bccb8dac4089\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-69f4747886-mg6lg" Oct 03 00:23:28 crc kubenswrapper[4832]: I1003 00:23:28.875579 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-97fx8" Oct 03 00:23:28 crc kubenswrapper[4832]: I1003 00:23:28.970927 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/ac6102f2-3843-451a-bcd2-cbb010cb5466-observability-operator-tls\") pod \"observability-operator-cc5f78dfc-482lq\" (UID: \"ac6102f2-3843-451a-bcd2-cbb010cb5466\") " pod="openshift-operators/observability-operator-cc5f78dfc-482lq" Oct 03 00:23:28 crc kubenswrapper[4832]: I1003 00:23:28.970990 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bc5h9\" (UniqueName: \"kubernetes.io/projected/ac6102f2-3843-451a-bcd2-cbb010cb5466-kube-api-access-bc5h9\") pod \"observability-operator-cc5f78dfc-482lq\" (UID: \"ac6102f2-3843-451a-bcd2-cbb010cb5466\") " pod="openshift-operators/observability-operator-cc5f78dfc-482lq" Oct 03 00:23:28 crc kubenswrapper[4832]: I1003 00:23:28.989403 4832 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-54bc95c9fb-48qqv"] Oct 03 00:23:28 crc kubenswrapper[4832]: I1003 00:23:28.989891 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-69f4747886-tbzpl" Oct 03 00:23:29 crc kubenswrapper[4832]: I1003 00:23:28.990769 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-54bc95c9fb-48qqv" Oct 03 00:23:29 crc kubenswrapper[4832]: I1003 00:23:28.992667 4832 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-xk499" Oct 03 00:23:29 crc kubenswrapper[4832]: I1003 00:23:29.003331 4832 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-69f4747886-mg6lg" Oct 03 00:23:29 crc kubenswrapper[4832]: I1003 00:23:29.072877 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/ac6102f2-3843-451a-bcd2-cbb010cb5466-observability-operator-tls\") pod \"observability-operator-cc5f78dfc-482lq\" (UID: \"ac6102f2-3843-451a-bcd2-cbb010cb5466\") " pod="openshift-operators/observability-operator-cc5f78dfc-482lq" Oct 03 00:23:29 crc kubenswrapper[4832]: I1003 00:23:29.072935 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bc5h9\" (UniqueName: \"kubernetes.io/projected/ac6102f2-3843-451a-bcd2-cbb010cb5466-kube-api-access-bc5h9\") pod \"observability-operator-cc5f78dfc-482lq\" (UID: \"ac6102f2-3843-451a-bcd2-cbb010cb5466\") " pod="openshift-operators/observability-operator-cc5f78dfc-482lq" Oct 03 00:23:29 crc kubenswrapper[4832]: I1003 00:23:29.072983 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/d4268b16-87c1-4af3-aa56-b86e56183f43-openshift-service-ca\") pod \"perses-operator-54bc95c9fb-48qqv\" (UID: \"d4268b16-87c1-4af3-aa56-b86e56183f43\") " pod="openshift-operators/perses-operator-54bc95c9fb-48qqv" Oct 03 00:23:29 crc kubenswrapper[4832]: I1003 00:23:29.073014 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8s6b7\" (UniqueName: \"kubernetes.io/projected/d4268b16-87c1-4af3-aa56-b86e56183f43-kube-api-access-8s6b7\") pod \"perses-operator-54bc95c9fb-48qqv\" (UID: \"d4268b16-87c1-4af3-aa56-b86e56183f43\") " pod="openshift-operators/perses-operator-54bc95c9fb-48qqv" Oct 03 00:23:29 crc kubenswrapper[4832]: I1003 00:23:29.074169 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-54bc95c9fb-48qqv"] Oct 03 00:23:29 crc kubenswrapper[4832]: I1003 00:23:29.078487 4832 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69cj7kb"] Oct 03 00:23:29 crc kubenswrapper[4832]: I1003 00:23:29.079677 4832 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69cj7kb" Oct 03 00:23:29 crc kubenswrapper[4832]: I1003 00:23:29.081343 4832 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 03 00:23:29 crc kubenswrapper[4832]: I1003 00:23:29.083545 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/ac6102f2-3843-451a-bcd2-cbb010cb5466-observability-operator-tls\") pod \"observability-operator-cc5f78dfc-482lq\" (UID: \"ac6102f2-3843-451a-bcd2-cbb010cb5466\") " pod="openshift-operators/observability-operator-cc5f78dfc-482lq" Oct 03 00:23:29 crc kubenswrapper[4832]: I1003 00:23:29.083602 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69cj7kb"] Oct 03 00:23:29 crc kubenswrapper[4832]: I1003 00:23:29.097607 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bc5h9\" (UniqueName: \"kubernetes.io/projected/ac6102f2-3843-451a-bcd2-cbb010cb5466-kube-api-access-bc5h9\") pod \"observability-operator-cc5f78dfc-482lq\" (UID: \"ac6102f2-3843-451a-bcd2-cbb010cb5466\") " pod="openshift-operators/observability-operator-cc5f78dfc-482lq" Oct 03 00:23:29 crc kubenswrapper[4832]: I1003 00:23:29.141109 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-cc5f78dfc-482lq" Oct 03 00:23:29 crc kubenswrapper[4832]: I1003 00:23:29.166921 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-7c8cf85677-97fx8"] Oct 03 00:23:29 crc kubenswrapper[4832]: I1003 00:23:29.173576 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8s6b7\" (UniqueName: \"kubernetes.io/projected/d4268b16-87c1-4af3-aa56-b86e56183f43-kube-api-access-8s6b7\") pod \"perses-operator-54bc95c9fb-48qqv\" (UID: \"d4268b16-87c1-4af3-aa56-b86e56183f43\") " pod="openshift-operators/perses-operator-54bc95c9fb-48qqv" Oct 03 00:23:29 crc kubenswrapper[4832]: I1003 00:23:29.173643 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2gl8t\" (UniqueName: \"kubernetes.io/projected/b47ed99b-5898-4dc0-908c-b3edb092437a-kube-api-access-2gl8t\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69cj7kb\" (UID: \"b47ed99b-5898-4dc0-908c-b3edb092437a\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69cj7kb" Oct 03 00:23:29 crc kubenswrapper[4832]: I1003 00:23:29.173695 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b47ed99b-5898-4dc0-908c-b3edb092437a-util\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69cj7kb\" (UID: \"b47ed99b-5898-4dc0-908c-b3edb092437a\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69cj7kb" Oct 03 00:23:29 crc kubenswrapper[4832]: I1003 00:23:29.173740 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b47ed99b-5898-4dc0-908c-b3edb092437a-bundle\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69cj7kb\" (UID: \"b47ed99b-5898-4dc0-908c-b3edb092437a\") " 
pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69cj7kb" Oct 03 00:23:29 crc kubenswrapper[4832]: I1003 00:23:29.173819 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/d4268b16-87c1-4af3-aa56-b86e56183f43-openshift-service-ca\") pod \"perses-operator-54bc95c9fb-48qqv\" (UID: \"d4268b16-87c1-4af3-aa56-b86e56183f43\") " pod="openshift-operators/perses-operator-54bc95c9fb-48qqv" Oct 03 00:23:29 crc kubenswrapper[4832]: I1003 00:23:29.174799 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/d4268b16-87c1-4af3-aa56-b86e56183f43-openshift-service-ca\") pod \"perses-operator-54bc95c9fb-48qqv\" (UID: \"d4268b16-87c1-4af3-aa56-b86e56183f43\") " pod="openshift-operators/perses-operator-54bc95c9fb-48qqv" Oct 03 00:23:29 crc kubenswrapper[4832]: I1003 00:23:29.206236 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8s6b7\" (UniqueName: \"kubernetes.io/projected/d4268b16-87c1-4af3-aa56-b86e56183f43-kube-api-access-8s6b7\") pod \"perses-operator-54bc95c9fb-48qqv\" (UID: \"d4268b16-87c1-4af3-aa56-b86e56183f43\") " pod="openshift-operators/perses-operator-54bc95c9fb-48qqv" Oct 03 00:23:29 crc kubenswrapper[4832]: I1003 00:23:29.279319 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2gl8t\" (UniqueName: \"kubernetes.io/projected/b47ed99b-5898-4dc0-908c-b3edb092437a-kube-api-access-2gl8t\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69cj7kb\" (UID: \"b47ed99b-5898-4dc0-908c-b3edb092437a\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69cj7kb" Oct 03 00:23:29 crc kubenswrapper[4832]: I1003 00:23:29.279386 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b47ed99b-5898-4dc0-908c-b3edb092437a-util\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69cj7kb\" (UID: \"b47ed99b-5898-4dc0-908c-b3edb092437a\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69cj7kb" Oct 03 00:23:29 crc kubenswrapper[4832]: I1003 00:23:29.279422 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b47ed99b-5898-4dc0-908c-b3edb092437a-bundle\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69cj7kb\" (UID: \"b47ed99b-5898-4dc0-908c-b3edb092437a\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69cj7kb" Oct 03 00:23:29 crc kubenswrapper[4832]: I1003 00:23:29.279964 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b47ed99b-5898-4dc0-908c-b3edb092437a-bundle\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69cj7kb\" (UID: \"b47ed99b-5898-4dc0-908c-b3edb092437a\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69cj7kb" Oct 03 00:23:29 crc kubenswrapper[4832]: I1003 00:23:29.280474 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b47ed99b-5898-4dc0-908c-b3edb092437a-util\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69cj7kb\" (UID: \"b47ed99b-5898-4dc0-908c-b3edb092437a\") " 
pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69cj7kb" Oct 03 00:23:29 crc kubenswrapper[4832]: I1003 00:23:29.306626 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2gl8t\" (UniqueName: \"kubernetes.io/projected/b47ed99b-5898-4dc0-908c-b3edb092437a-kube-api-access-2gl8t\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69cj7kb\" (UID: \"b47ed99b-5898-4dc0-908c-b3edb092437a\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69cj7kb" Oct 03 00:23:29 crc kubenswrapper[4832]: I1003 00:23:29.326631 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-97fx8" event={"ID":"79f70684-a7c1-4ae6-97f5-9cf0d32f472f","Type":"ContainerStarted","Data":"7b9291e2224942353930cd607aa2860ed0430cd03cd7aec22ad8adfc8ca84259"} Oct 03 00:23:29 crc kubenswrapper[4832]: I1003 00:23:29.375254 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-54bc95c9fb-48qqv" Oct 03 00:23:29 crc kubenswrapper[4832]: I1003 00:23:29.403195 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-69f4747886-mg6lg"] Oct 03 00:23:29 crc kubenswrapper[4832]: W1003 00:23:29.405513 4832 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0e1c4356_58d2_49a9_aca4_bccb8dac4089.slice/crio-77ddc55769b5cff29328adbd70a061272ed595fc2ad4453fde4b40cb552f8a45 WatchSource:0}: Error finding container 77ddc55769b5cff29328adbd70a061272ed595fc2ad4453fde4b40cb552f8a45: Status 404 returned error can't find the container with id 77ddc55769b5cff29328adbd70a061272ed595fc2ad4453fde4b40cb552f8a45 Oct 03 00:23:29 crc kubenswrapper[4832]: I1003 00:23:29.428169 4832 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69cj7kb" Oct 03 00:23:29 crc kubenswrapper[4832]: I1003 00:23:29.548447 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-cc5f78dfc-482lq"] Oct 03 00:23:29 crc kubenswrapper[4832]: I1003 00:23:29.654060 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-54bc95c9fb-48qqv"] Oct 03 00:23:29 crc kubenswrapper[4832]: I1003 00:23:29.681944 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-69f4747886-tbzpl"] Oct 03 00:23:29 crc kubenswrapper[4832]: I1003 00:23:29.747683 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69cj7kb"] Oct 03 00:23:29 crc kubenswrapper[4832]: W1003 00:23:29.752448 4832 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb47ed99b_5898_4dc0_908c_b3edb092437a.slice/crio-d143fd4a9a6f3e763c8a755ef2d4e690995159ec6c57d07305e55cd89a440596 WatchSource:0}: Error finding container d143fd4a9a6f3e763c8a755ef2d4e690995159ec6c57d07305e55cd89a440596: Status 404 returned error can't find the container with id d143fd4a9a6f3e763c8a755ef2d4e690995159ec6c57d07305e55cd89a440596 Oct 03 00:23:30 crc kubenswrapper[4832]: I1003 00:23:30.333201 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-69f4747886-mg6lg" event={"ID":"0e1c4356-58d2-49a9-aca4-bccb8dac4089","Type":"ContainerStarted","Data":"77ddc55769b5cff29328adbd70a061272ed595fc2ad4453fde4b40cb552f8a45"} Oct 03 00:23:30 crc kubenswrapper[4832]: I1003 00:23:30.335348 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-54bc95c9fb-48qqv" event={"ID":"d4268b16-87c1-4af3-aa56-b86e56183f43","Type":"ContainerStarted","Data":"0a1b16f788ecdee8549cd4a46ad1f0cb4a099a44c366b50328a88f4a2b681ac6"} Oct 03 00:23:30 crc kubenswrapper[4832]: I1003 00:23:30.338453 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-69f4747886-tbzpl" event={"ID":"d0c894cc-8a02-4c23-b822-e4652aa0fde8","Type":"ContainerStarted","Data":"436e0d2de644b5580a40a66123060ff64dfaf82775d1ac3ce4960158fe86e6b0"} Oct 03 00:23:30 crc kubenswrapper[4832]: I1003 00:23:30.339873 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-cc5f78dfc-482lq" event={"ID":"ac6102f2-3843-451a-bcd2-cbb010cb5466","Type":"ContainerStarted","Data":"4a398d4dfaa2ee78f296f633debccbd1f8dac26aa6fc2e8db93a968ec3cf1ef0"} Oct 03 00:23:30 crc kubenswrapper[4832]: I1003 00:23:30.341269 4832 generic.go:334] "Generic (PLEG): container finished" podID="b47ed99b-5898-4dc0-908c-b3edb092437a" containerID="5c22d582355146c7a916adb4d4b4599cd550f535dc07e32e3c7eaab2c2afdf1a" exitCode=0 Oct 03 00:23:30 crc kubenswrapper[4832]: I1003 00:23:30.341324 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69cj7kb" event={"ID":"b47ed99b-5898-4dc0-908c-b3edb092437a","Type":"ContainerDied","Data":"5c22d582355146c7a916adb4d4b4599cd550f535dc07e32e3c7eaab2c2afdf1a"} Oct 03 00:23:30 crc kubenswrapper[4832]: I1003 00:23:30.341378 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69cj7kb" event={"ID":"b47ed99b-5898-4dc0-908c-b3edb092437a","Type":"ContainerStarted","Data":"d143fd4a9a6f3e763c8a755ef2d4e690995159ec6c57d07305e55cd89a440596"} Oct 03 00:23:36 crc kubenswrapper[4832]: I1003 00:23:36.242626 4832 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/interconnect-operator-5bb49f789d-xqw6q"] Oct 03 00:23:36 crc kubenswrapper[4832]: I1003 00:23:36.244080 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/interconnect-operator-5bb49f789d-xqw6q" Oct 03 00:23:36 crc kubenswrapper[4832]: I1003 00:23:36.245398 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/interconnect-operator-5bb49f789d-xqw6q"] Oct 03 00:23:36 crc kubenswrapper[4832]: I1003 00:23:36.246111 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"openshift-service-ca.crt" Oct 03 00:23:36 crc kubenswrapper[4832]: I1003 00:23:36.246269 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"kube-root-ca.crt" Oct 03 00:23:36 crc kubenswrapper[4832]: I1003 00:23:36.247463 4832 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"interconnect-operator-dockercfg-r7b7p" Oct 03 00:23:36 crc kubenswrapper[4832]: I1003 00:23:36.414672 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2rn4j\" (UniqueName: \"kubernetes.io/projected/a0a3e834-8d06-41e1-8588-a71be3210ed7-kube-api-access-2rn4j\") pod \"interconnect-operator-5bb49f789d-xqw6q\" (UID: \"a0a3e834-8d06-41e1-8588-a71be3210ed7\") " pod="service-telemetry/interconnect-operator-5bb49f789d-xqw6q" Oct 03 00:23:36 crc kubenswrapper[4832]: I1003 00:23:36.516314 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2rn4j\" (UniqueName: \"kubernetes.io/projected/a0a3e834-8d06-41e1-8588-a71be3210ed7-kube-api-access-2rn4j\") pod \"interconnect-operator-5bb49f789d-xqw6q\" (UID: \"a0a3e834-8d06-41e1-8588-a71be3210ed7\") " pod="service-telemetry/interconnect-operator-5bb49f789d-xqw6q" Oct 03 00:23:36 crc kubenswrapper[4832]: I1003 00:23:36.548729 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2rn4j\" (UniqueName: \"kubernetes.io/projected/a0a3e834-8d06-41e1-8588-a71be3210ed7-kube-api-access-2rn4j\") pod \"interconnect-operator-5bb49f789d-xqw6q\" (UID: \"a0a3e834-8d06-41e1-8588-a71be3210ed7\") " pod="service-telemetry/interconnect-operator-5bb49f789d-xqw6q" Oct 03 00:23:36 crc kubenswrapper[4832]: I1003 00:23:36.601846 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/interconnect-operator-5bb49f789d-xqw6q" Oct 03 00:23:39 crc kubenswrapper[4832]: I1003 00:23:39.195725 4832 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/elastic-operator-56fc64664d-k6nmd"] Oct 03 00:23:39 crc kubenswrapper[4832]: I1003 00:23:39.196669 4832 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/elastic-operator-56fc64664d-k6nmd" Oct 03 00:23:39 crc kubenswrapper[4832]: I1003 00:23:39.199088 4832 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elastic-operator-dockercfg-kp7wd" Oct 03 00:23:39 crc kubenswrapper[4832]: I1003 00:23:39.199515 4832 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elastic-operator-service-cert" Oct 03 00:23:39 crc kubenswrapper[4832]: I1003 00:23:39.207404 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elastic-operator-56fc64664d-k6nmd"] Oct 03 00:23:39 crc kubenswrapper[4832]: I1003 00:23:39.353176 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e45dd70b-1389-486c-b186-014c6228beda-webhook-cert\") pod \"elastic-operator-56fc64664d-k6nmd\" (UID: \"e45dd70b-1389-486c-b186-014c6228beda\") " pod="service-telemetry/elastic-operator-56fc64664d-k6nmd" Oct 03 00:23:39 crc kubenswrapper[4832]: I1003 00:23:39.353305 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e45dd70b-1389-486c-b186-014c6228beda-apiservice-cert\") pod \"elastic-operator-56fc64664d-k6nmd\" (UID: \"e45dd70b-1389-486c-b186-014c6228beda\") " pod="service-telemetry/elastic-operator-56fc64664d-k6nmd" Oct 03 00:23:39 crc kubenswrapper[4832]: I1003 00:23:39.353337 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67kl4\" (UniqueName: \"kubernetes.io/projected/e45dd70b-1389-486c-b186-014c6228beda-kube-api-access-67kl4\") pod \"elastic-operator-56fc64664d-k6nmd\" (UID: \"e45dd70b-1389-486c-b186-014c6228beda\") " pod="service-telemetry/elastic-operator-56fc64664d-k6nmd" Oct 03 00:23:39 crc kubenswrapper[4832]: I1003 00:23:39.455150 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e45dd70b-1389-486c-b186-014c6228beda-webhook-cert\") pod \"elastic-operator-56fc64664d-k6nmd\" (UID: \"e45dd70b-1389-486c-b186-014c6228beda\") " pod="service-telemetry/elastic-operator-56fc64664d-k6nmd" Oct 03 00:23:39 crc kubenswrapper[4832]: I1003 00:23:39.455237 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e45dd70b-1389-486c-b186-014c6228beda-apiservice-cert\") pod \"elastic-operator-56fc64664d-k6nmd\" (UID: \"e45dd70b-1389-486c-b186-014c6228beda\") " pod="service-telemetry/elastic-operator-56fc64664d-k6nmd" Oct 03 00:23:39 crc kubenswrapper[4832]: I1003 00:23:39.455267 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-67kl4\" (UniqueName: \"kubernetes.io/projected/e45dd70b-1389-486c-b186-014c6228beda-kube-api-access-67kl4\") pod \"elastic-operator-56fc64664d-k6nmd\" (UID: \"e45dd70b-1389-486c-b186-014c6228beda\") " pod="service-telemetry/elastic-operator-56fc64664d-k6nmd" Oct 03 00:23:39 crc kubenswrapper[4832]: I1003 00:23:39.463659 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e45dd70b-1389-486c-b186-014c6228beda-webhook-cert\") pod \"elastic-operator-56fc64664d-k6nmd\" (UID: \"e45dd70b-1389-486c-b186-014c6228beda\") " pod="service-telemetry/elastic-operator-56fc64664d-k6nmd" Oct 03 00:23:39 crc 
kubenswrapper[4832]: I1003 00:23:39.471719 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-67kl4\" (UniqueName: \"kubernetes.io/projected/e45dd70b-1389-486c-b186-014c6228beda-kube-api-access-67kl4\") pod \"elastic-operator-56fc64664d-k6nmd\" (UID: \"e45dd70b-1389-486c-b186-014c6228beda\") " pod="service-telemetry/elastic-operator-56fc64664d-k6nmd" Oct 03 00:23:39 crc kubenswrapper[4832]: I1003 00:23:39.477604 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e45dd70b-1389-486c-b186-014c6228beda-apiservice-cert\") pod \"elastic-operator-56fc64664d-k6nmd\" (UID: \"e45dd70b-1389-486c-b186-014c6228beda\") " pod="service-telemetry/elastic-operator-56fc64664d-k6nmd" Oct 03 00:23:39 crc kubenswrapper[4832]: I1003 00:23:39.514780 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/elastic-operator-56fc64664d-k6nmd" Oct 03 00:23:43 crc kubenswrapper[4832]: I1003 00:23:43.337064 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elastic-operator-56fc64664d-k6nmd"] Oct 03 00:23:43 crc kubenswrapper[4832]: W1003 00:23:43.348993 4832 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode45dd70b_1389_486c_b186_014c6228beda.slice/crio-bfb047d74de2c7beee2f37053c85c4f45cba867aa007a4ed61fd471a74df169b WatchSource:0}: Error finding container bfb047d74de2c7beee2f37053c85c4f45cba867aa007a4ed61fd471a74df169b: Status 404 returned error can't find the container with id bfb047d74de2c7beee2f37053c85c4f45cba867aa007a4ed61fd471a74df169b Oct 03 00:23:43 crc kubenswrapper[4832]: I1003 00:23:43.440811 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/interconnect-operator-5bb49f789d-xqw6q"] Oct 03 00:23:43 crc kubenswrapper[4832]: I1003 00:23:43.447801 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elastic-operator-56fc64664d-k6nmd" event={"ID":"e45dd70b-1389-486c-b186-014c6228beda","Type":"ContainerStarted","Data":"bfb047d74de2c7beee2f37053c85c4f45cba867aa007a4ed61fd471a74df169b"} Oct 03 00:23:43 crc kubenswrapper[4832]: I1003 00:23:43.449056 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-97fx8" event={"ID":"79f70684-a7c1-4ae6-97f5-9cf0d32f472f","Type":"ContainerStarted","Data":"a96f2060feb5ca6da1b57dd13b4d7f43ff115d3f720c6c2c01670b62399303a3"} Oct 03 00:23:43 crc kubenswrapper[4832]: I1003 00:23:43.450699 4832 generic.go:334] "Generic (PLEG): container finished" podID="b47ed99b-5898-4dc0-908c-b3edb092437a" containerID="0891d9ad4e9ef3473807b125621f76e6661d712b2e0892aa22978f527a8b9061" exitCode=0 Oct 03 00:23:43 crc kubenswrapper[4832]: I1003 00:23:43.450796 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69cj7kb" event={"ID":"b47ed99b-5898-4dc0-908c-b3edb092437a","Type":"ContainerDied","Data":"0891d9ad4e9ef3473807b125621f76e6661d712b2e0892aa22978f527a8b9061"} Oct 03 00:23:43 crc kubenswrapper[4832]: I1003 00:23:43.452815 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-54bc95c9fb-48qqv" event={"ID":"d4268b16-87c1-4af3-aa56-b86e56183f43","Type":"ContainerStarted","Data":"f872a8dc8dbd52d971ce93104311d385848b69d5303617fc84b1f51da145e6df"} Oct 03 00:23:43 crc kubenswrapper[4832]: I1003 
00:23:43.452899 4832 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-54bc95c9fb-48qqv" Oct 03 00:23:43 crc kubenswrapper[4832]: I1003 00:23:43.455234 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-69f4747886-mg6lg" event={"ID":"0e1c4356-58d2-49a9-aca4-bccb8dac4089","Type":"ContainerStarted","Data":"050c63a99daec1e8902cda58d80cc11a8af800879456f030e1eeb0035fd65909"} Oct 03 00:23:43 crc kubenswrapper[4832]: I1003 00:23:43.458175 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-69f4747886-tbzpl" event={"ID":"d0c894cc-8a02-4c23-b822-e4652aa0fde8","Type":"ContainerStarted","Data":"3de90c2d27186f3667489921061d2fce518cfb55996fe8869b40d9a522c2869e"} Oct 03 00:23:43 crc kubenswrapper[4832]: I1003 00:23:43.459212 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-cc5f78dfc-482lq" event={"ID":"ac6102f2-3843-451a-bcd2-cbb010cb5466","Type":"ContainerStarted","Data":"ea734d83906196cb1ee7659eef72decca193d72074e305dd89aa2646c338c3f2"} Oct 03 00:23:43 crc kubenswrapper[4832]: I1003 00:23:43.459412 4832 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-cc5f78dfc-482lq" Oct 03 00:23:43 crc kubenswrapper[4832]: W1003 00:23:43.470492 4832 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda0a3e834_8d06_41e1_8588_a71be3210ed7.slice/crio-b5c43116d4bae55b05e528b9b4bac3f708a996eb277b108ae73d4f8e5f413008 WatchSource:0}: Error finding container b5c43116d4bae55b05e528b9b4bac3f708a996eb277b108ae73d4f8e5f413008: Status 404 returned error can't find the container with id b5c43116d4bae55b05e528b9b4bac3f708a996eb277b108ae73d4f8e5f413008 Oct 03 00:23:43 crc kubenswrapper[4832]: I1003 00:23:43.499327 4832 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-cc5f78dfc-482lq" Oct 03 00:23:43 crc kubenswrapper[4832]: I1003 00:23:43.501877 4832 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-69f4747886-tbzpl" podStartSLOduration=2.188505429 podStartE2EDuration="15.501858671s" podCreationTimestamp="2025-10-03 00:23:28 +0000 UTC" firstStartedPulling="2025-10-03 00:23:29.700660121 +0000 UTC m=+1002.962815227" lastFinishedPulling="2025-10-03 00:23:43.014013363 +0000 UTC m=+1016.276168469" observedRunningTime="2025-10-03 00:23:43.497853736 +0000 UTC m=+1016.760008862" watchObservedRunningTime="2025-10-03 00:23:43.501858671 +0000 UTC m=+1016.764013787" Oct 03 00:23:43 crc kubenswrapper[4832]: I1003 00:23:43.502401 4832 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-97fx8" podStartSLOduration=1.679264302 podStartE2EDuration="15.502394948s" podCreationTimestamp="2025-10-03 00:23:28 +0000 UTC" firstStartedPulling="2025-10-03 00:23:29.206245497 +0000 UTC m=+1002.468400603" lastFinishedPulling="2025-10-03 00:23:43.029376143 +0000 UTC m=+1016.291531249" observedRunningTime="2025-10-03 00:23:43.474113344 +0000 UTC m=+1016.736268450" watchObservedRunningTime="2025-10-03 00:23:43.502394948 +0000 UTC m=+1016.764550064" Oct 03 00:23:43 crc kubenswrapper[4832]: I1003 00:23:43.523255 4832 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-operators/observability-operator-cc5f78dfc-482lq" podStartSLOduration=2.019973385 podStartE2EDuration="15.523236149s" podCreationTimestamp="2025-10-03 00:23:28 +0000 UTC" firstStartedPulling="2025-10-03 00:23:29.592006157 +0000 UTC m=+1002.854161263" lastFinishedPulling="2025-10-03 00:23:43.095268921 +0000 UTC m=+1016.357424027" observedRunningTime="2025-10-03 00:23:43.519635277 +0000 UTC m=+1016.781790383" watchObservedRunningTime="2025-10-03 00:23:43.523236149 +0000 UTC m=+1016.785391255" Oct 03 00:23:43 crc kubenswrapper[4832]: I1003 00:23:43.574917 4832 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-69f4747886-mg6lg" podStartSLOduration=1.898098737 podStartE2EDuration="15.574898382s" podCreationTimestamp="2025-10-03 00:23:28 +0000 UTC" firstStartedPulling="2025-10-03 00:23:29.410725444 +0000 UTC m=+1002.672880550" lastFinishedPulling="2025-10-03 00:23:43.087525089 +0000 UTC m=+1016.349680195" observedRunningTime="2025-10-03 00:23:43.573901981 +0000 UTC m=+1016.836057087" watchObservedRunningTime="2025-10-03 00:23:43.574898382 +0000 UTC m=+1016.837053488" Oct 03 00:23:43 crc kubenswrapper[4832]: I1003 00:23:43.575068 4832 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-54bc95c9fb-48qqv" podStartSLOduration=2.217997241 podStartE2EDuration="15.575064308s" podCreationTimestamp="2025-10-03 00:23:28 +0000 UTC" firstStartedPulling="2025-10-03 00:23:29.673501023 +0000 UTC m=+1002.935656129" lastFinishedPulling="2025-10-03 00:23:43.03056809 +0000 UTC m=+1016.292723196" observedRunningTime="2025-10-03 00:23:43.560158712 +0000 UTC m=+1016.822313838" watchObservedRunningTime="2025-10-03 00:23:43.575064308 +0000 UTC m=+1016.837219414" Oct 03 00:23:44 crc kubenswrapper[4832]: I1003 00:23:44.466811 4832 generic.go:334] "Generic (PLEG): container finished" podID="b47ed99b-5898-4dc0-908c-b3edb092437a" containerID="5afbe37e6cad12a5d6126518fe35379401c9d938f1103fa31307e906db456086" exitCode=0 Oct 03 00:23:44 crc kubenswrapper[4832]: I1003 00:23:44.466850 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69cj7kb" event={"ID":"b47ed99b-5898-4dc0-908c-b3edb092437a","Type":"ContainerDied","Data":"5afbe37e6cad12a5d6126518fe35379401c9d938f1103fa31307e906db456086"} Oct 03 00:23:44 crc kubenswrapper[4832]: I1003 00:23:44.468274 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/interconnect-operator-5bb49f789d-xqw6q" event={"ID":"a0a3e834-8d06-41e1-8588-a71be3210ed7","Type":"ContainerStarted","Data":"b5c43116d4bae55b05e528b9b4bac3f708a996eb277b108ae73d4f8e5f413008"} Oct 03 00:23:45 crc kubenswrapper[4832]: I1003 00:23:45.920284 4832 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69cj7kb" Oct 03 00:23:45 crc kubenswrapper[4832]: I1003 00:23:45.942382 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b47ed99b-5898-4dc0-908c-b3edb092437a-util\") pod \"b47ed99b-5898-4dc0-908c-b3edb092437a\" (UID: \"b47ed99b-5898-4dc0-908c-b3edb092437a\") " Oct 03 00:23:45 crc kubenswrapper[4832]: I1003 00:23:45.942489 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b47ed99b-5898-4dc0-908c-b3edb092437a-bundle\") pod \"b47ed99b-5898-4dc0-908c-b3edb092437a\" (UID: \"b47ed99b-5898-4dc0-908c-b3edb092437a\") " Oct 03 00:23:45 crc kubenswrapper[4832]: I1003 00:23:45.942532 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2gl8t\" (UniqueName: \"kubernetes.io/projected/b47ed99b-5898-4dc0-908c-b3edb092437a-kube-api-access-2gl8t\") pod \"b47ed99b-5898-4dc0-908c-b3edb092437a\" (UID: \"b47ed99b-5898-4dc0-908c-b3edb092437a\") " Oct 03 00:23:45 crc kubenswrapper[4832]: I1003 00:23:45.944150 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b47ed99b-5898-4dc0-908c-b3edb092437a-bundle" (OuterVolumeSpecName: "bundle") pod "b47ed99b-5898-4dc0-908c-b3edb092437a" (UID: "b47ed99b-5898-4dc0-908c-b3edb092437a"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 00:23:45 crc kubenswrapper[4832]: I1003 00:23:45.952474 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b47ed99b-5898-4dc0-908c-b3edb092437a-kube-api-access-2gl8t" (OuterVolumeSpecName: "kube-api-access-2gl8t") pod "b47ed99b-5898-4dc0-908c-b3edb092437a" (UID: "b47ed99b-5898-4dc0-908c-b3edb092437a"). InnerVolumeSpecName "kube-api-access-2gl8t". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 00:23:45 crc kubenswrapper[4832]: I1003 00:23:45.987145 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b47ed99b-5898-4dc0-908c-b3edb092437a-util" (OuterVolumeSpecName: "util") pod "b47ed99b-5898-4dc0-908c-b3edb092437a" (UID: "b47ed99b-5898-4dc0-908c-b3edb092437a"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 00:23:46 crc kubenswrapper[4832]: I1003 00:23:46.050474 4832 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b47ed99b-5898-4dc0-908c-b3edb092437a-util\") on node \"crc\" DevicePath \"\"" Oct 03 00:23:46 crc kubenswrapper[4832]: I1003 00:23:46.050513 4832 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b47ed99b-5898-4dc0-908c-b3edb092437a-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 00:23:46 crc kubenswrapper[4832]: I1003 00:23:46.050525 4832 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2gl8t\" (UniqueName: \"kubernetes.io/projected/b47ed99b-5898-4dc0-908c-b3edb092437a-kube-api-access-2gl8t\") on node \"crc\" DevicePath \"\"" Oct 03 00:23:46 crc kubenswrapper[4832]: I1003 00:23:46.510340 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elastic-operator-56fc64664d-k6nmd" event={"ID":"e45dd70b-1389-486c-b186-014c6228beda","Type":"ContainerStarted","Data":"bec8e1c47579b2eae807b73c50daab6c75efe3396af3b67b470025f48eadbfe1"} Oct 03 00:23:46 crc kubenswrapper[4832]: I1003 00:23:46.516617 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69cj7kb" event={"ID":"b47ed99b-5898-4dc0-908c-b3edb092437a","Type":"ContainerDied","Data":"d143fd4a9a6f3e763c8a755ef2d4e690995159ec6c57d07305e55cd89a440596"} Oct 03 00:23:46 crc kubenswrapper[4832]: I1003 00:23:46.516660 4832 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d143fd4a9a6f3e763c8a755ef2d4e690995159ec6c57d07305e55cd89a440596" Oct 03 00:23:46 crc kubenswrapper[4832]: I1003 00:23:46.516708 4832 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69cj7kb" Oct 03 00:23:46 crc kubenswrapper[4832]: I1003 00:23:46.561583 4832 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/elastic-operator-56fc64664d-k6nmd" podStartSLOduration=4.990912117 podStartE2EDuration="7.561566721s" podCreationTimestamp="2025-10-03 00:23:39 +0000 UTC" firstStartedPulling="2025-10-03 00:23:43.351966029 +0000 UTC m=+1016.614121135" lastFinishedPulling="2025-10-03 00:23:45.922620633 +0000 UTC m=+1019.184775739" observedRunningTime="2025-10-03 00:23:46.55993862 +0000 UTC m=+1019.822093726" watchObservedRunningTime="2025-10-03 00:23:46.561566721 +0000 UTC m=+1019.823721827" Oct 03 00:23:49 crc kubenswrapper[4832]: I1003 00:23:49.379012 4832 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-54bc95c9fb-48qqv" Oct 03 00:23:52 crc kubenswrapper[4832]: I1003 00:23:52.178770 4832 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/elasticsearch-es-default-0"] Oct 03 00:23:52 crc kubenswrapper[4832]: E1003 00:23:52.179487 4832 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b47ed99b-5898-4dc0-908c-b3edb092437a" containerName="pull" Oct 03 00:23:52 crc kubenswrapper[4832]: I1003 00:23:52.179500 4832 state_mem.go:107] "Deleted CPUSet assignment" podUID="b47ed99b-5898-4dc0-908c-b3edb092437a" containerName="pull" Oct 03 00:23:52 crc kubenswrapper[4832]: E1003 00:23:52.179510 4832 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b47ed99b-5898-4dc0-908c-b3edb092437a" containerName="extract" Oct 03 00:23:52 crc kubenswrapper[4832]: I1003 00:23:52.179516 4832 state_mem.go:107] "Deleted CPUSet assignment" podUID="b47ed99b-5898-4dc0-908c-b3edb092437a" containerName="extract" Oct 03 00:23:52 crc kubenswrapper[4832]: E1003 00:23:52.179526 4832 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b47ed99b-5898-4dc0-908c-b3edb092437a" containerName="util" Oct 03 00:23:52 crc kubenswrapper[4832]: I1003 00:23:52.179532 4832 state_mem.go:107] "Deleted CPUSet assignment" podUID="b47ed99b-5898-4dc0-908c-b3edb092437a" containerName="util" Oct 03 00:23:52 crc kubenswrapper[4832]: I1003 00:23:52.179624 4832 memory_manager.go:354] "RemoveStaleState removing state" podUID="b47ed99b-5898-4dc0-908c-b3edb092437a" containerName="extract" Oct 03 00:23:52 crc kubenswrapper[4832]: I1003 00:23:52.180373 4832 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/elasticsearch-es-default-0" Oct 03 00:23:52 crc kubenswrapper[4832]: I1003 00:23:52.182841 4832 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-remote-ca" Oct 03 00:23:52 crc kubenswrapper[4832]: I1003 00:23:52.183061 4832 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-default-es-config" Oct 03 00:23:52 crc kubenswrapper[4832]: I1003 00:23:52.183188 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"elasticsearch-es-unicast-hosts" Oct 03 00:23:52 crc kubenswrapper[4832]: I1003 00:23:52.183347 4832 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-internal-users" Oct 03 00:23:52 crc kubenswrapper[4832]: I1003 00:23:52.184289 4832 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-dockercfg-d8zlk" Oct 03 00:23:52 crc kubenswrapper[4832]: I1003 00:23:52.185056 4832 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-http-certs-internal" Oct 03 00:23:52 crc kubenswrapper[4832]: I1003 00:23:52.185280 4832 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-default-es-transport-certs" Oct 03 00:23:52 crc kubenswrapper[4832]: I1003 00:23:52.185469 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"elasticsearch-es-scripts" Oct 03 00:23:52 crc kubenswrapper[4832]: I1003 00:23:52.186510 4832 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-xpack-file-realm" Oct 03 00:23:52 crc kubenswrapper[4832]: I1003 00:23:52.209105 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elasticsearch-es-default-0"] Oct 03 00:23:52 crc kubenswrapper[4832]: I1003 00:23:52.236144 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-unicast-hosts\" (UniqueName: \"kubernetes.io/configmap/43340dac-0a14-49ad-b56c-22a735ee0273-elastic-internal-unicast-hosts\") pod \"elasticsearch-es-default-0\" (UID: \"43340dac-0a14-49ad-b56c-22a735ee0273\") " pod="service-telemetry/elasticsearch-es-default-0" Oct 03 00:23:52 crc kubenswrapper[4832]: I1003 00:23:52.236216 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elasticsearch-logs\" (UniqueName: \"kubernetes.io/empty-dir/43340dac-0a14-49ad-b56c-22a735ee0273-elasticsearch-logs\") pod \"elasticsearch-es-default-0\" (UID: \"43340dac-0a14-49ad-b56c-22a735ee0273\") " pod="service-telemetry/elasticsearch-es-default-0" Oct 03 00:23:52 crc kubenswrapper[4832]: I1003 00:23:52.236249 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-http-certificates\" (UniqueName: \"kubernetes.io/secret/43340dac-0a14-49ad-b56c-22a735ee0273-elastic-internal-http-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"43340dac-0a14-49ad-b56c-22a735ee0273\") " pod="service-telemetry/elasticsearch-es-default-0" Oct 03 00:23:52 crc kubenswrapper[4832]: I1003 00:23:52.236293 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elasticsearch-data\" (UniqueName: \"kubernetes.io/empty-dir/43340dac-0a14-49ad-b56c-22a735ee0273-elasticsearch-data\") pod \"elasticsearch-es-default-0\" (UID: 
\"43340dac-0a14-49ad-b56c-22a735ee0273\") " pod="service-telemetry/elasticsearch-es-default-0" Oct 03 00:23:52 crc kubenswrapper[4832]: I1003 00:23:52.236418 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-xpack-file-realm\" (UniqueName: \"kubernetes.io/secret/43340dac-0a14-49ad-b56c-22a735ee0273-elastic-internal-xpack-file-realm\") pod \"elasticsearch-es-default-0\" (UID: \"43340dac-0a14-49ad-b56c-22a735ee0273\") " pod="service-telemetry/elasticsearch-es-default-0" Oct 03 00:23:52 crc kubenswrapper[4832]: I1003 00:23:52.236469 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-probe-user\" (UniqueName: \"kubernetes.io/secret/43340dac-0a14-49ad-b56c-22a735ee0273-elastic-internal-probe-user\") pod \"elasticsearch-es-default-0\" (UID: \"43340dac-0a14-49ad-b56c-22a735ee0273\") " pod="service-telemetry/elasticsearch-es-default-0" Oct 03 00:23:52 crc kubenswrapper[4832]: I1003 00:23:52.236526 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-remote-certificate-authorities\" (UniqueName: \"kubernetes.io/secret/43340dac-0a14-49ad-b56c-22a735ee0273-elastic-internal-remote-certificate-authorities\") pod \"elasticsearch-es-default-0\" (UID: \"43340dac-0a14-49ad-b56c-22a735ee0273\") " pod="service-telemetry/elasticsearch-es-default-0" Oct 03 00:23:52 crc kubenswrapper[4832]: I1003 00:23:52.236584 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-volume\" (UniqueName: \"kubernetes.io/empty-dir/43340dac-0a14-49ad-b56c-22a735ee0273-tmp-volume\") pod \"elasticsearch-es-default-0\" (UID: \"43340dac-0a14-49ad-b56c-22a735ee0273\") " pod="service-telemetry/elasticsearch-es-default-0" Oct 03 00:23:52 crc kubenswrapper[4832]: I1003 00:23:52.236631 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-elasticsearch-config\" (UniqueName: \"kubernetes.io/secret/43340dac-0a14-49ad-b56c-22a735ee0273-elastic-internal-elasticsearch-config\") pod \"elasticsearch-es-default-0\" (UID: \"43340dac-0a14-49ad-b56c-22a735ee0273\") " pod="service-telemetry/elasticsearch-es-default-0" Oct 03 00:23:52 crc kubenswrapper[4832]: I1003 00:23:52.236664 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-elasticsearch-config-local\" (UniqueName: \"kubernetes.io/empty-dir/43340dac-0a14-49ad-b56c-22a735ee0273-elastic-internal-elasticsearch-config-local\") pod \"elasticsearch-es-default-0\" (UID: \"43340dac-0a14-49ad-b56c-22a735ee0273\") " pod="service-telemetry/elasticsearch-es-default-0" Oct 03 00:23:52 crc kubenswrapper[4832]: I1003 00:23:52.236690 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-scripts\" (UniqueName: \"kubernetes.io/configmap/43340dac-0a14-49ad-b56c-22a735ee0273-elastic-internal-scripts\") pod \"elasticsearch-es-default-0\" (UID: \"43340dac-0a14-49ad-b56c-22a735ee0273\") " pod="service-telemetry/elasticsearch-es-default-0" Oct 03 00:23:52 crc kubenswrapper[4832]: I1003 00:23:52.236721 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-elasticsearch-bin-local\" (UniqueName: \"kubernetes.io/empty-dir/43340dac-0a14-49ad-b56c-22a735ee0273-elastic-internal-elasticsearch-bin-local\") pod 
\"elasticsearch-es-default-0\" (UID: \"43340dac-0a14-49ad-b56c-22a735ee0273\") " pod="service-telemetry/elasticsearch-es-default-0" Oct 03 00:23:52 crc kubenswrapper[4832]: I1003 00:23:52.236798 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"downward-api\" (UniqueName: \"kubernetes.io/downward-api/43340dac-0a14-49ad-b56c-22a735ee0273-downward-api\") pod \"elasticsearch-es-default-0\" (UID: \"43340dac-0a14-49ad-b56c-22a735ee0273\") " pod="service-telemetry/elasticsearch-es-default-0" Oct 03 00:23:52 crc kubenswrapper[4832]: I1003 00:23:52.236898 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-elasticsearch-plugins-local\" (UniqueName: \"kubernetes.io/empty-dir/43340dac-0a14-49ad-b56c-22a735ee0273-elastic-internal-elasticsearch-plugins-local\") pod \"elasticsearch-es-default-0\" (UID: \"43340dac-0a14-49ad-b56c-22a735ee0273\") " pod="service-telemetry/elasticsearch-es-default-0" Oct 03 00:23:52 crc kubenswrapper[4832]: I1003 00:23:52.236970 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-transport-certificates\" (UniqueName: \"kubernetes.io/secret/43340dac-0a14-49ad-b56c-22a735ee0273-elastic-internal-transport-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"43340dac-0a14-49ad-b56c-22a735ee0273\") " pod="service-telemetry/elasticsearch-es-default-0" Oct 03 00:23:52 crc kubenswrapper[4832]: I1003 00:23:52.338327 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-elasticsearch-config\" (UniqueName: \"kubernetes.io/secret/43340dac-0a14-49ad-b56c-22a735ee0273-elastic-internal-elasticsearch-config\") pod \"elasticsearch-es-default-0\" (UID: \"43340dac-0a14-49ad-b56c-22a735ee0273\") " pod="service-telemetry/elasticsearch-es-default-0" Oct 03 00:23:52 crc kubenswrapper[4832]: I1003 00:23:52.338372 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-elasticsearch-config-local\" (UniqueName: \"kubernetes.io/empty-dir/43340dac-0a14-49ad-b56c-22a735ee0273-elastic-internal-elasticsearch-config-local\") pod \"elasticsearch-es-default-0\" (UID: \"43340dac-0a14-49ad-b56c-22a735ee0273\") " pod="service-telemetry/elasticsearch-es-default-0" Oct 03 00:23:52 crc kubenswrapper[4832]: I1003 00:23:52.338391 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-scripts\" (UniqueName: \"kubernetes.io/configmap/43340dac-0a14-49ad-b56c-22a735ee0273-elastic-internal-scripts\") pod \"elasticsearch-es-default-0\" (UID: \"43340dac-0a14-49ad-b56c-22a735ee0273\") " pod="service-telemetry/elasticsearch-es-default-0" Oct 03 00:23:52 crc kubenswrapper[4832]: I1003 00:23:52.338421 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-elasticsearch-bin-local\" (UniqueName: \"kubernetes.io/empty-dir/43340dac-0a14-49ad-b56c-22a735ee0273-elastic-internal-elasticsearch-bin-local\") pod \"elasticsearch-es-default-0\" (UID: \"43340dac-0a14-49ad-b56c-22a735ee0273\") " pod="service-telemetry/elasticsearch-es-default-0" Oct 03 00:23:52 crc kubenswrapper[4832]: I1003 00:23:52.338443 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"downward-api\" (UniqueName: \"kubernetes.io/downward-api/43340dac-0a14-49ad-b56c-22a735ee0273-downward-api\") pod \"elasticsearch-es-default-0\" (UID: 
\"43340dac-0a14-49ad-b56c-22a735ee0273\") " pod="service-telemetry/elasticsearch-es-default-0" Oct 03 00:23:52 crc kubenswrapper[4832]: I1003 00:23:52.338476 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-elasticsearch-plugins-local\" (UniqueName: \"kubernetes.io/empty-dir/43340dac-0a14-49ad-b56c-22a735ee0273-elastic-internal-elasticsearch-plugins-local\") pod \"elasticsearch-es-default-0\" (UID: \"43340dac-0a14-49ad-b56c-22a735ee0273\") " pod="service-telemetry/elasticsearch-es-default-0" Oct 03 00:23:52 crc kubenswrapper[4832]: I1003 00:23:52.338491 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-transport-certificates\" (UniqueName: \"kubernetes.io/secret/43340dac-0a14-49ad-b56c-22a735ee0273-elastic-internal-transport-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"43340dac-0a14-49ad-b56c-22a735ee0273\") " pod="service-telemetry/elasticsearch-es-default-0" Oct 03 00:23:52 crc kubenswrapper[4832]: I1003 00:23:52.338516 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-unicast-hosts\" (UniqueName: \"kubernetes.io/configmap/43340dac-0a14-49ad-b56c-22a735ee0273-elastic-internal-unicast-hosts\") pod \"elasticsearch-es-default-0\" (UID: \"43340dac-0a14-49ad-b56c-22a735ee0273\") " pod="service-telemetry/elasticsearch-es-default-0" Oct 03 00:23:52 crc kubenswrapper[4832]: I1003 00:23:52.338541 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elasticsearch-logs\" (UniqueName: \"kubernetes.io/empty-dir/43340dac-0a14-49ad-b56c-22a735ee0273-elasticsearch-logs\") pod \"elasticsearch-es-default-0\" (UID: \"43340dac-0a14-49ad-b56c-22a735ee0273\") " pod="service-telemetry/elasticsearch-es-default-0" Oct 03 00:23:52 crc kubenswrapper[4832]: I1003 00:23:52.338560 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-http-certificates\" (UniqueName: \"kubernetes.io/secret/43340dac-0a14-49ad-b56c-22a735ee0273-elastic-internal-http-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"43340dac-0a14-49ad-b56c-22a735ee0273\") " pod="service-telemetry/elasticsearch-es-default-0" Oct 03 00:23:52 crc kubenswrapper[4832]: I1003 00:23:52.338585 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elasticsearch-data\" (UniqueName: \"kubernetes.io/empty-dir/43340dac-0a14-49ad-b56c-22a735ee0273-elasticsearch-data\") pod \"elasticsearch-es-default-0\" (UID: \"43340dac-0a14-49ad-b56c-22a735ee0273\") " pod="service-telemetry/elasticsearch-es-default-0" Oct 03 00:23:52 crc kubenswrapper[4832]: I1003 00:23:52.338604 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-xpack-file-realm\" (UniqueName: \"kubernetes.io/secret/43340dac-0a14-49ad-b56c-22a735ee0273-elastic-internal-xpack-file-realm\") pod \"elasticsearch-es-default-0\" (UID: \"43340dac-0a14-49ad-b56c-22a735ee0273\") " pod="service-telemetry/elasticsearch-es-default-0" Oct 03 00:23:52 crc kubenswrapper[4832]: I1003 00:23:52.338626 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-probe-user\" (UniqueName: \"kubernetes.io/secret/43340dac-0a14-49ad-b56c-22a735ee0273-elastic-internal-probe-user\") pod \"elasticsearch-es-default-0\" (UID: \"43340dac-0a14-49ad-b56c-22a735ee0273\") " pod="service-telemetry/elasticsearch-es-default-0" Oct 03 00:23:52 crc kubenswrapper[4832]: I1003 00:23:52.338651 
4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-remote-certificate-authorities\" (UniqueName: \"kubernetes.io/secret/43340dac-0a14-49ad-b56c-22a735ee0273-elastic-internal-remote-certificate-authorities\") pod \"elasticsearch-es-default-0\" (UID: \"43340dac-0a14-49ad-b56c-22a735ee0273\") " pod="service-telemetry/elasticsearch-es-default-0" Oct 03 00:23:52 crc kubenswrapper[4832]: I1003 00:23:52.338675 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp-volume\" (UniqueName: \"kubernetes.io/empty-dir/43340dac-0a14-49ad-b56c-22a735ee0273-tmp-volume\") pod \"elasticsearch-es-default-0\" (UID: \"43340dac-0a14-49ad-b56c-22a735ee0273\") " pod="service-telemetry/elasticsearch-es-default-0" Oct 03 00:23:52 crc kubenswrapper[4832]: I1003 00:23:52.339198 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-elasticsearch-plugins-local\" (UniqueName: \"kubernetes.io/empty-dir/43340dac-0a14-49ad-b56c-22a735ee0273-elastic-internal-elasticsearch-plugins-local\") pod \"elasticsearch-es-default-0\" (UID: \"43340dac-0a14-49ad-b56c-22a735ee0273\") " pod="service-telemetry/elasticsearch-es-default-0" Oct 03 00:23:52 crc kubenswrapper[4832]: I1003 00:23:52.339580 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elasticsearch-logs\" (UniqueName: \"kubernetes.io/empty-dir/43340dac-0a14-49ad-b56c-22a735ee0273-elasticsearch-logs\") pod \"elasticsearch-es-default-0\" (UID: \"43340dac-0a14-49ad-b56c-22a735ee0273\") " pod="service-telemetry/elasticsearch-es-default-0" Oct 03 00:23:52 crc kubenswrapper[4832]: I1003 00:23:52.339768 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elasticsearch-data\" (UniqueName: \"kubernetes.io/empty-dir/43340dac-0a14-49ad-b56c-22a735ee0273-elasticsearch-data\") pod \"elasticsearch-es-default-0\" (UID: \"43340dac-0a14-49ad-b56c-22a735ee0273\") " pod="service-telemetry/elasticsearch-es-default-0" Oct 03 00:23:52 crc kubenswrapper[4832]: I1003 00:23:52.339828 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-elasticsearch-bin-local\" (UniqueName: \"kubernetes.io/empty-dir/43340dac-0a14-49ad-b56c-22a735ee0273-elastic-internal-elasticsearch-bin-local\") pod \"elasticsearch-es-default-0\" (UID: \"43340dac-0a14-49ad-b56c-22a735ee0273\") " pod="service-telemetry/elasticsearch-es-default-0" Oct 03 00:23:52 crc kubenswrapper[4832]: I1003 00:23:52.339876 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-unicast-hosts\" (UniqueName: \"kubernetes.io/configmap/43340dac-0a14-49ad-b56c-22a735ee0273-elastic-internal-unicast-hosts\") pod \"elasticsearch-es-default-0\" (UID: \"43340dac-0a14-49ad-b56c-22a735ee0273\") " pod="service-telemetry/elasticsearch-es-default-0" Oct 03 00:23:52 crc kubenswrapper[4832]: I1003 00:23:52.339979 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp-volume\" (UniqueName: \"kubernetes.io/empty-dir/43340dac-0a14-49ad-b56c-22a735ee0273-tmp-volume\") pod \"elasticsearch-es-default-0\" (UID: \"43340dac-0a14-49ad-b56c-22a735ee0273\") " pod="service-telemetry/elasticsearch-es-default-0" Oct 03 00:23:52 crc kubenswrapper[4832]: I1003 00:23:52.340514 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-scripts\" (UniqueName: \"kubernetes.io/configmap/43340dac-0a14-49ad-b56c-22a735ee0273-elastic-internal-scripts\") pod \"elasticsearch-es-default-0\" (UID: 
\"43340dac-0a14-49ad-b56c-22a735ee0273\") " pod="service-telemetry/elasticsearch-es-default-0" Oct 03 00:23:52 crc kubenswrapper[4832]: I1003 00:23:52.340839 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-elasticsearch-config-local\" (UniqueName: \"kubernetes.io/empty-dir/43340dac-0a14-49ad-b56c-22a735ee0273-elastic-internal-elasticsearch-config-local\") pod \"elasticsearch-es-default-0\" (UID: \"43340dac-0a14-49ad-b56c-22a735ee0273\") " pod="service-telemetry/elasticsearch-es-default-0" Oct 03 00:23:52 crc kubenswrapper[4832]: I1003 00:23:52.345811 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"downward-api\" (UniqueName: \"kubernetes.io/downward-api/43340dac-0a14-49ad-b56c-22a735ee0273-downward-api\") pod \"elasticsearch-es-default-0\" (UID: \"43340dac-0a14-49ad-b56c-22a735ee0273\") " pod="service-telemetry/elasticsearch-es-default-0" Oct 03 00:23:52 crc kubenswrapper[4832]: I1003 00:23:52.346082 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-remote-certificate-authorities\" (UniqueName: \"kubernetes.io/secret/43340dac-0a14-49ad-b56c-22a735ee0273-elastic-internal-remote-certificate-authorities\") pod \"elasticsearch-es-default-0\" (UID: \"43340dac-0a14-49ad-b56c-22a735ee0273\") " pod="service-telemetry/elasticsearch-es-default-0" Oct 03 00:23:52 crc kubenswrapper[4832]: I1003 00:23:52.347366 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-xpack-file-realm\" (UniqueName: \"kubernetes.io/secret/43340dac-0a14-49ad-b56c-22a735ee0273-elastic-internal-xpack-file-realm\") pod \"elasticsearch-es-default-0\" (UID: \"43340dac-0a14-49ad-b56c-22a735ee0273\") " pod="service-telemetry/elasticsearch-es-default-0" Oct 03 00:23:52 crc kubenswrapper[4832]: I1003 00:23:52.347539 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-probe-user\" (UniqueName: \"kubernetes.io/secret/43340dac-0a14-49ad-b56c-22a735ee0273-elastic-internal-probe-user\") pod \"elasticsearch-es-default-0\" (UID: \"43340dac-0a14-49ad-b56c-22a735ee0273\") " pod="service-telemetry/elasticsearch-es-default-0" Oct 03 00:23:52 crc kubenswrapper[4832]: I1003 00:23:52.348702 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-elasticsearch-config\" (UniqueName: \"kubernetes.io/secret/43340dac-0a14-49ad-b56c-22a735ee0273-elastic-internal-elasticsearch-config\") pod \"elasticsearch-es-default-0\" (UID: \"43340dac-0a14-49ad-b56c-22a735ee0273\") " pod="service-telemetry/elasticsearch-es-default-0" Oct 03 00:23:52 crc kubenswrapper[4832]: I1003 00:23:52.354554 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-transport-certificates\" (UniqueName: \"kubernetes.io/secret/43340dac-0a14-49ad-b56c-22a735ee0273-elastic-internal-transport-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"43340dac-0a14-49ad-b56c-22a735ee0273\") " pod="service-telemetry/elasticsearch-es-default-0" Oct 03 00:23:52 crc kubenswrapper[4832]: I1003 00:23:52.372170 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-http-certificates\" (UniqueName: \"kubernetes.io/secret/43340dac-0a14-49ad-b56c-22a735ee0273-elastic-internal-http-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"43340dac-0a14-49ad-b56c-22a735ee0273\") " pod="service-telemetry/elasticsearch-es-default-0" Oct 03 00:23:52 crc kubenswrapper[4832]: I1003 00:23:52.498731 4832 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/elasticsearch-es-default-0" Oct 03 00:23:52 crc kubenswrapper[4832]: I1003 00:23:52.567851 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/interconnect-operator-5bb49f789d-xqw6q" event={"ID":"a0a3e834-8d06-41e1-8588-a71be3210ed7","Type":"ContainerStarted","Data":"5c318730607eb6dcc5e6753cafd0da67861b74f87d5a732b7650fa1d118b393a"} Oct 03 00:23:52 crc kubenswrapper[4832]: I1003 00:23:52.592640 4832 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/interconnect-operator-5bb49f789d-xqw6q" podStartSLOduration=8.371687581 podStartE2EDuration="16.592619451s" podCreationTimestamp="2025-10-03 00:23:36 +0000 UTC" firstStartedPulling="2025-10-03 00:23:43.475113706 +0000 UTC m=+1016.737268812" lastFinishedPulling="2025-10-03 00:23:51.696045576 +0000 UTC m=+1024.958200682" observedRunningTime="2025-10-03 00:23:52.591984801 +0000 UTC m=+1025.854139907" watchObservedRunningTime="2025-10-03 00:23:52.592619451 +0000 UTC m=+1025.854774557" Oct 03 00:23:52 crc kubenswrapper[4832]: I1003 00:23:52.891379 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elasticsearch-es-default-0"] Oct 03 00:23:52 crc kubenswrapper[4832]: W1003 00:23:52.901159 4832 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod43340dac_0a14_49ad_b56c_22a735ee0273.slice/crio-4e61413b3478a796af22ff3a357c272ad45d712c45a74fe1ba28e70ec15e9a4c WatchSource:0}: Error finding container 4e61413b3478a796af22ff3a357c272ad45d712c45a74fe1ba28e70ec15e9a4c: Status 404 returned error can't find the container with id 4e61413b3478a796af22ff3a357c272ad45d712c45a74fe1ba28e70ec15e9a4c Oct 03 00:23:53 crc kubenswrapper[4832]: I1003 00:23:53.573868 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elasticsearch-es-default-0" event={"ID":"43340dac-0a14-49ad-b56c-22a735ee0273","Type":"ContainerStarted","Data":"4e61413b3478a796af22ff3a357c272ad45d712c45a74fe1ba28e70ec15e9a4c"} Oct 03 00:24:10 crc kubenswrapper[4832]: E1003 00:24:10.761066 4832 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="registry.connect.redhat.com/elastic/elasticsearch:7.17.20" Oct 03 00:24:10 crc kubenswrapper[4832]: E1003 00:24:10.761786 4832 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:elastic-internal-init-filesystem,Image:registry.connect.redhat.com/elastic/elasticsearch:7.17.20,Command:[bash -c 
/mnt/elastic-internal/scripts/prepare-fs.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:HEADLESS_SERVICE_NAME,Value:elasticsearch-es-default,ValueFrom:nil,},EnvVar{Name:PROBE_PASSWORD_PATH,Value:/mnt/elastic-internal/pod-mounted-users/elastic-internal-probe,ValueFrom:nil,},EnvVar{Name:PROBE_USERNAME,Value:elastic-internal-probe,ValueFrom:nil,},EnvVar{Name:READINESS_PROBE_PROTOCOL,Value:https,ValueFrom:nil,},EnvVar{Name:NSS_SDB_USE_CACHE,Value:no,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{100 -3} {} 100m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Requests:ResourceList{cpu: {{100 -3} {} 100m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:downward-api,ReadOnly:true,MountPath:/mnt/elastic-internal/downward-api,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elastic-internal-elasticsearch-bin-local,ReadOnly:false,MountPath:/mnt/elastic-internal/elasticsearch-bin-local,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elastic-internal-elasticsearch-config,ReadOnly:true,MountPath:/mnt/elastic-internal/elasticsearch-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elastic-internal-elasticsearch-config-local,ReadOnly:false,MountPath:/mnt/elastic-internal/elasticsearch-config-local,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elastic-internal-elasticsearch-plugins-local,ReadOnly:false,MountPath:/mnt/elastic-internal/elasticsearch-plugins-local,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elastic-internal-http-certificates,ReadOnly:true,MountPath:/usr/share/elasticsearch/config/http-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elastic-internal-probe-user,ReadOnly:true,MountPath:/mnt/elastic-internal/pod-mounted-users,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elastic-internal-remote-certificate-authorities,ReadOnly:true,MountPath:/usr/share/elasticsearch/config/transport-remote-certs/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elastic-internal-scripts,ReadOnly:true,MountPath:/mnt/elastic-internal/scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elastic-internal-transport-certificates,ReadOnly:true,MountPath:/mnt/elastic-internal/transport-certificates,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elastic-internal-unicast-hosts,ReadOnly:true,MountPath:/mnt/elastic-internal/unicast-hosts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadO
nly:nil,},VolumeMount{Name:elastic-internal-xpack-file-realm,ReadOnly:true,MountPath:/mnt/elastic-internal/xpack-file-realm,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elasticsearch-data,ReadOnly:false,MountPath:/usr/share/elasticsearch/data,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elasticsearch-logs,ReadOnly:false,MountPath:/usr/share/elasticsearch/logs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tmp-volume,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*1000670000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod elasticsearch-es-default-0_service-telemetry(43340dac-0a14-49ad-b56c-22a735ee0273): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 03 00:24:10 crc kubenswrapper[4832]: E1003 00:24:10.763141 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"elastic-internal-init-filesystem\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="service-telemetry/elasticsearch-es-default-0" podUID="43340dac-0a14-49ad-b56c-22a735ee0273" Oct 03 00:24:11 crc kubenswrapper[4832]: I1003 00:24:11.547009 4832 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-96d66748b-xdkkz"] Oct 03 00:24:11 crc kubenswrapper[4832]: I1003 00:24:11.549989 4832 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-96d66748b-xdkkz" Oct 03 00:24:11 crc kubenswrapper[4832]: I1003 00:24:11.551688 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"kube-root-ca.crt" Oct 03 00:24:11 crc kubenswrapper[4832]: I1003 00:24:11.551701 4832 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager-operator"/"cert-manager-operator-controller-manager-dockercfg-ljrmm" Oct 03 00:24:11 crc kubenswrapper[4832]: I1003 00:24:11.556581 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"openshift-service-ca.crt" Oct 03 00:24:11 crc kubenswrapper[4832]: I1003 00:24:11.570192 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-96d66748b-xdkkz"] Oct 03 00:24:11 crc kubenswrapper[4832]: I1003 00:24:11.594382 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjmxq\" (UniqueName: \"kubernetes.io/projected/68a69ef0-e792-43fa-a06c-be46eac25a9e-kube-api-access-tjmxq\") pod \"cert-manager-operator-controller-manager-96d66748b-xdkkz\" (UID: \"68a69ef0-e792-43fa-a06c-be46eac25a9e\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-96d66748b-xdkkz" Oct 03 00:24:11 crc kubenswrapper[4832]: E1003 00:24:11.685030 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"elastic-internal-init-filesystem\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.connect.redhat.com/elastic/elasticsearch:7.17.20\\\"\"" pod="service-telemetry/elasticsearch-es-default-0" podUID="43340dac-0a14-49ad-b56c-22a735ee0273" Oct 03 00:24:11 crc kubenswrapper[4832]: I1003 00:24:11.695594 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tjmxq\" (UniqueName: \"kubernetes.io/projected/68a69ef0-e792-43fa-a06c-be46eac25a9e-kube-api-access-tjmxq\") pod \"cert-manager-operator-controller-manager-96d66748b-xdkkz\" (UID: \"68a69ef0-e792-43fa-a06c-be46eac25a9e\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-96d66748b-xdkkz" Oct 03 00:24:11 crc kubenswrapper[4832]: I1003 00:24:11.717210 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tjmxq\" (UniqueName: \"kubernetes.io/projected/68a69ef0-e792-43fa-a06c-be46eac25a9e-kube-api-access-tjmxq\") pod \"cert-manager-operator-controller-manager-96d66748b-xdkkz\" (UID: \"68a69ef0-e792-43fa-a06c-be46eac25a9e\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-96d66748b-xdkkz" Oct 03 00:24:11 crc kubenswrapper[4832]: I1003 00:24:11.836255 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elasticsearch-es-default-0"] Oct 03 00:24:11 crc kubenswrapper[4832]: I1003 00:24:11.868142 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elasticsearch-es-default-0"] Oct 03 00:24:11 crc kubenswrapper[4832]: I1003 00:24:11.893972 4832 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-96d66748b-xdkkz" Oct 03 00:24:12 crc kubenswrapper[4832]: I1003 00:24:12.094706 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-96d66748b-xdkkz"] Oct 03 00:24:12 crc kubenswrapper[4832]: W1003 00:24:12.104908 4832 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod68a69ef0_e792_43fa_a06c_be46eac25a9e.slice/crio-ecd8bd5b37cd0379d5d0cba1b73d190576f7b5de61b70f7f39fa2bfd21066a0d WatchSource:0}: Error finding container ecd8bd5b37cd0379d5d0cba1b73d190576f7b5de61b70f7f39fa2bfd21066a0d: Status 404 returned error can't find the container with id ecd8bd5b37cd0379d5d0cba1b73d190576f7b5de61b70f7f39fa2bfd21066a0d Oct 03 00:24:12 crc kubenswrapper[4832]: I1003 00:24:12.689823 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-96d66748b-xdkkz" event={"ID":"68a69ef0-e792-43fa-a06c-be46eac25a9e","Type":"ContainerStarted","Data":"ecd8bd5b37cd0379d5d0cba1b73d190576f7b5de61b70f7f39fa2bfd21066a0d"} Oct 03 00:24:12 crc kubenswrapper[4832]: E1003 00:24:12.691452 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"elastic-internal-init-filesystem\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.connect.redhat.com/elastic/elasticsearch:7.17.20\\\"\"" pod="service-telemetry/elasticsearch-es-default-0" podUID="43340dac-0a14-49ad-b56c-22a735ee0273" Oct 03 00:24:13 crc kubenswrapper[4832]: E1003 00:24:13.696267 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"elastic-internal-init-filesystem\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.connect.redhat.com/elastic/elasticsearch:7.17.20\\\"\"" pod="service-telemetry/elasticsearch-es-default-0" podUID="43340dac-0a14-49ad-b56c-22a735ee0273" Oct 03 00:24:15 crc kubenswrapper[4832]: I1003 00:24:15.707188 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-96d66748b-xdkkz" event={"ID":"68a69ef0-e792-43fa-a06c-be46eac25a9e","Type":"ContainerStarted","Data":"b42e97ad97a3603fd9a8712f0ca883c5ba955c3b1d7c31d8c4d67f1d66c4a90f"} Oct 03 00:24:19 crc kubenswrapper[4832]: I1003 00:24:19.099377 4832 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-96d66748b-xdkkz" podStartSLOduration=5.019616994 podStartE2EDuration="8.099361779s" podCreationTimestamp="2025-10-03 00:24:11 +0000 UTC" firstStartedPulling="2025-10-03 00:24:12.106709164 +0000 UTC m=+1045.368864270" lastFinishedPulling="2025-10-03 00:24:15.186453949 +0000 UTC m=+1048.448609055" observedRunningTime="2025-10-03 00:24:15.728466309 +0000 UTC m=+1048.990621415" watchObservedRunningTime="2025-10-03 00:24:19.099361779 +0000 UTC m=+1052.361516885" Oct 03 00:24:19 crc kubenswrapper[4832]: I1003 00:24:19.102983 4832 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-d969966f-tbgqj"] Oct 03 00:24:19 crc kubenswrapper[4832]: I1003 00:24:19.103721 4832 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-d969966f-tbgqj" Oct 03 00:24:19 crc kubenswrapper[4832]: I1003 00:24:19.105854 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Oct 03 00:24:19 crc kubenswrapper[4832]: I1003 00:24:19.106394 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Oct 03 00:24:19 crc kubenswrapper[4832]: I1003 00:24:19.112974 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-d969966f-tbgqj"] Oct 03 00:24:19 crc kubenswrapper[4832]: I1003 00:24:19.119314 4832 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-ktdq9" Oct 03 00:24:19 crc kubenswrapper[4832]: I1003 00:24:19.181786 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7gdh\" (UniqueName: \"kubernetes.io/projected/270b08f1-756c-4eb7-b0bc-59cefba57eb1-kube-api-access-x7gdh\") pod \"cert-manager-webhook-d969966f-tbgqj\" (UID: \"270b08f1-756c-4eb7-b0bc-59cefba57eb1\") " pod="cert-manager/cert-manager-webhook-d969966f-tbgqj" Oct 03 00:24:19 crc kubenswrapper[4832]: I1003 00:24:19.181863 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/270b08f1-756c-4eb7-b0bc-59cefba57eb1-bound-sa-token\") pod \"cert-manager-webhook-d969966f-tbgqj\" (UID: \"270b08f1-756c-4eb7-b0bc-59cefba57eb1\") " pod="cert-manager/cert-manager-webhook-d969966f-tbgqj" Oct 03 00:24:19 crc kubenswrapper[4832]: I1003 00:24:19.283352 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/270b08f1-756c-4eb7-b0bc-59cefba57eb1-bound-sa-token\") pod \"cert-manager-webhook-d969966f-tbgqj\" (UID: \"270b08f1-756c-4eb7-b0bc-59cefba57eb1\") " pod="cert-manager/cert-manager-webhook-d969966f-tbgqj" Oct 03 00:24:19 crc kubenswrapper[4832]: I1003 00:24:19.283455 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x7gdh\" (UniqueName: \"kubernetes.io/projected/270b08f1-756c-4eb7-b0bc-59cefba57eb1-kube-api-access-x7gdh\") pod \"cert-manager-webhook-d969966f-tbgqj\" (UID: \"270b08f1-756c-4eb7-b0bc-59cefba57eb1\") " pod="cert-manager/cert-manager-webhook-d969966f-tbgqj" Oct 03 00:24:19 crc kubenswrapper[4832]: I1003 00:24:19.301831 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/270b08f1-756c-4eb7-b0bc-59cefba57eb1-bound-sa-token\") pod \"cert-manager-webhook-d969966f-tbgqj\" (UID: \"270b08f1-756c-4eb7-b0bc-59cefba57eb1\") " pod="cert-manager/cert-manager-webhook-d969966f-tbgqj" Oct 03 00:24:19 crc kubenswrapper[4832]: I1003 00:24:19.304955 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7gdh\" (UniqueName: \"kubernetes.io/projected/270b08f1-756c-4eb7-b0bc-59cefba57eb1-kube-api-access-x7gdh\") pod \"cert-manager-webhook-d969966f-tbgqj\" (UID: \"270b08f1-756c-4eb7-b0bc-59cefba57eb1\") " pod="cert-manager/cert-manager-webhook-d969966f-tbgqj" Oct 03 00:24:19 crc kubenswrapper[4832]: I1003 00:24:19.420380 4832 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-d969966f-tbgqj" Oct 03 00:24:19 crc kubenswrapper[4832]: I1003 00:24:19.585938 4832 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-7d9f95dbf-gd7wc"] Oct 03 00:24:19 crc kubenswrapper[4832]: I1003 00:24:19.588142 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7d9f95dbf-gd7wc" Oct 03 00:24:19 crc kubenswrapper[4832]: I1003 00:24:19.593541 4832 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-fh8q9" Oct 03 00:24:19 crc kubenswrapper[4832]: I1003 00:24:19.594332 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7d9f95dbf-gd7wc"] Oct 03 00:24:19 crc kubenswrapper[4832]: I1003 00:24:19.646910 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-d969966f-tbgqj"] Oct 03 00:24:19 crc kubenswrapper[4832]: I1003 00:24:19.690211 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/45a1b7f0-a950-421d-978a-651d666f9399-bound-sa-token\") pod \"cert-manager-cainjector-7d9f95dbf-gd7wc\" (UID: \"45a1b7f0-a950-421d-978a-651d666f9399\") " pod="cert-manager/cert-manager-cainjector-7d9f95dbf-gd7wc" Oct 03 00:24:19 crc kubenswrapper[4832]: I1003 00:24:19.690254 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jclr\" (UniqueName: \"kubernetes.io/projected/45a1b7f0-a950-421d-978a-651d666f9399-kube-api-access-9jclr\") pod \"cert-manager-cainjector-7d9f95dbf-gd7wc\" (UID: \"45a1b7f0-a950-421d-978a-651d666f9399\") " pod="cert-manager/cert-manager-cainjector-7d9f95dbf-gd7wc" Oct 03 00:24:19 crc kubenswrapper[4832]: I1003 00:24:19.737360 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-d969966f-tbgqj" event={"ID":"270b08f1-756c-4eb7-b0bc-59cefba57eb1","Type":"ContainerStarted","Data":"0fcb95732c6aaefe6d13dd8caac588efe12c73c6ea09193036d36ad156b68884"} Oct 03 00:24:19 crc kubenswrapper[4832]: I1003 00:24:19.791506 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/45a1b7f0-a950-421d-978a-651d666f9399-bound-sa-token\") pod \"cert-manager-cainjector-7d9f95dbf-gd7wc\" (UID: \"45a1b7f0-a950-421d-978a-651d666f9399\") " pod="cert-manager/cert-manager-cainjector-7d9f95dbf-gd7wc" Oct 03 00:24:19 crc kubenswrapper[4832]: I1003 00:24:19.791548 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9jclr\" (UniqueName: \"kubernetes.io/projected/45a1b7f0-a950-421d-978a-651d666f9399-kube-api-access-9jclr\") pod \"cert-manager-cainjector-7d9f95dbf-gd7wc\" (UID: \"45a1b7f0-a950-421d-978a-651d666f9399\") " pod="cert-manager/cert-manager-cainjector-7d9f95dbf-gd7wc" Oct 03 00:24:19 crc kubenswrapper[4832]: I1003 00:24:19.809160 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/45a1b7f0-a950-421d-978a-651d666f9399-bound-sa-token\") pod \"cert-manager-cainjector-7d9f95dbf-gd7wc\" (UID: \"45a1b7f0-a950-421d-978a-651d666f9399\") " pod="cert-manager/cert-manager-cainjector-7d9f95dbf-gd7wc" Oct 03 00:24:19 crc kubenswrapper[4832]: I1003 00:24:19.809442 4832 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-9jclr\" (UniqueName: \"kubernetes.io/projected/45a1b7f0-a950-421d-978a-651d666f9399-kube-api-access-9jclr\") pod \"cert-manager-cainjector-7d9f95dbf-gd7wc\" (UID: \"45a1b7f0-a950-421d-978a-651d666f9399\") " pod="cert-manager/cert-manager-cainjector-7d9f95dbf-gd7wc" Oct 03 00:24:19 crc kubenswrapper[4832]: I1003 00:24:19.915438 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7d9f95dbf-gd7wc" Oct 03 00:24:20 crc kubenswrapper[4832]: I1003 00:24:20.295503 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7d9f95dbf-gd7wc"] Oct 03 00:24:20 crc kubenswrapper[4832]: I1003 00:24:20.743599 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7d9f95dbf-gd7wc" event={"ID":"45a1b7f0-a950-421d-978a-651d666f9399","Type":"ContainerStarted","Data":"90083a13cb5e2151e42a861ec7319894f4a9edf43476ea4d4e05ae4856b13d34"} Oct 03 00:24:24 crc kubenswrapper[4832]: I1003 00:24:24.765414 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-d969966f-tbgqj" event={"ID":"270b08f1-756c-4eb7-b0bc-59cefba57eb1","Type":"ContainerStarted","Data":"a476d7b0ede0123f1dcfd605bc8945cc4b6e6130dcd0c137cfbe52f15e555edb"} Oct 03 00:24:24 crc kubenswrapper[4832]: I1003 00:24:24.766032 4832 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-d969966f-tbgqj" Oct 03 00:24:24 crc kubenswrapper[4832]: I1003 00:24:24.767574 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7d9f95dbf-gd7wc" event={"ID":"45a1b7f0-a950-421d-978a-651d666f9399","Type":"ContainerStarted","Data":"d25833a5af541fe8ee930a3b4f063277a0b77bb72819429f74c17d85626113ea"} Oct 03 00:24:24 crc kubenswrapper[4832]: I1003 00:24:24.780948 4832 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-d969966f-tbgqj" podStartSLOduration=1.560239818 podStartE2EDuration="5.780926152s" podCreationTimestamp="2025-10-03 00:24:19 +0000 UTC" firstStartedPulling="2025-10-03 00:24:19.655535371 +0000 UTC m=+1052.917690477" lastFinishedPulling="2025-10-03 00:24:23.876221705 +0000 UTC m=+1057.138376811" observedRunningTime="2025-10-03 00:24:24.77890997 +0000 UTC m=+1058.041065086" watchObservedRunningTime="2025-10-03 00:24:24.780926152 +0000 UTC m=+1058.043081258" Oct 03 00:24:24 crc kubenswrapper[4832]: I1003 00:24:24.794857 4832 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-7d9f95dbf-gd7wc" podStartSLOduration=2.20539232 podStartE2EDuration="5.794832457s" podCreationTimestamp="2025-10-03 00:24:19 +0000 UTC" firstStartedPulling="2025-10-03 00:24:20.300642651 +0000 UTC m=+1053.562797757" lastFinishedPulling="2025-10-03 00:24:23.890082788 +0000 UTC m=+1057.152237894" observedRunningTime="2025-10-03 00:24:24.791127242 +0000 UTC m=+1058.053282348" watchObservedRunningTime="2025-10-03 00:24:24.794832457 +0000 UTC m=+1058.056987563" Oct 03 00:24:26 crc kubenswrapper[4832]: I1003 00:24:26.396807 4832 patch_prober.go:28] interesting pod/machine-config-daemon-xzpk4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 00:24:26 crc kubenswrapper[4832]: I1003 
00:24:26.396879 4832 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" podUID="13eeb165-19f7-4e3f-a442-1c9c570d853b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 00:24:28 crc kubenswrapper[4832]: I1003 00:24:28.790500 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elasticsearch-es-default-0" event={"ID":"43340dac-0a14-49ad-b56c-22a735ee0273","Type":"ContainerStarted","Data":"b0856e6b0930d16e5284d4f8197b64efcd63be50dcbc9a8d63fc506557fce328"} Oct 03 00:24:29 crc kubenswrapper[4832]: I1003 00:24:29.425457 4832 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-d969966f-tbgqj" Oct 03 00:24:29 crc kubenswrapper[4832]: I1003 00:24:29.798033 4832 generic.go:334] "Generic (PLEG): container finished" podID="43340dac-0a14-49ad-b56c-22a735ee0273" containerID="b0856e6b0930d16e5284d4f8197b64efcd63be50dcbc9a8d63fc506557fce328" exitCode=0 Oct 03 00:24:29 crc kubenswrapper[4832]: I1003 00:24:29.798117 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elasticsearch-es-default-0" event={"ID":"43340dac-0a14-49ad-b56c-22a735ee0273","Type":"ContainerDied","Data":"b0856e6b0930d16e5284d4f8197b64efcd63be50dcbc9a8d63fc506557fce328"} Oct 03 00:24:30 crc kubenswrapper[4832]: I1003 00:24:30.810154 4832 generic.go:334] "Generic (PLEG): container finished" podID="43340dac-0a14-49ad-b56c-22a735ee0273" containerID="c08af87f50902445bd5bacd1aefc347f323a51e7c20c25996a626cdbc71553b2" exitCode=0 Oct 03 00:24:30 crc kubenswrapper[4832]: I1003 00:24:30.810227 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elasticsearch-es-default-0" event={"ID":"43340dac-0a14-49ad-b56c-22a735ee0273","Type":"ContainerDied","Data":"c08af87f50902445bd5bacd1aefc347f323a51e7c20c25996a626cdbc71553b2"} Oct 03 00:24:31 crc kubenswrapper[4832]: I1003 00:24:31.376879 4832 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/service-telemetry-operator-1-build"] Oct 03 00:24:31 crc kubenswrapper[4832]: I1003 00:24:31.377832 4832 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-1-build" Oct 03 00:24:31 crc kubenswrapper[4832]: I1003 00:24:31.379576 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-1-global-ca" Oct 03 00:24:31 crc kubenswrapper[4832]: I1003 00:24:31.379583 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-1-sys-config" Oct 03 00:24:31 crc kubenswrapper[4832]: I1003 00:24:31.379626 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-1-ca" Oct 03 00:24:31 crc kubenswrapper[4832]: I1003 00:24:31.379767 4832 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-bl95s" Oct 03 00:24:31 crc kubenswrapper[4832]: I1003 00:24:31.398039 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-1-build"] Oct 03 00:24:31 crc kubenswrapper[4832]: I1003 00:24:31.498966 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/09dbda77-0b4d-45c5-8c13-a82c96c8b50b-buildcachedir\") pod \"service-telemetry-operator-1-build\" (UID: \"09dbda77-0b4d-45c5-8c13-a82c96c8b50b\") " pod="service-telemetry/service-telemetry-operator-1-build" Oct 03 00:24:31 crc kubenswrapper[4832]: I1003 00:24:31.499044 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/09dbda77-0b4d-45c5-8c13-a82c96c8b50b-build-ca-bundles\") pod \"service-telemetry-operator-1-build\" (UID: \"09dbda77-0b4d-45c5-8c13-a82c96c8b50b\") " pod="service-telemetry/service-telemetry-operator-1-build" Oct 03 00:24:31 crc kubenswrapper[4832]: I1003 00:24:31.499070 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/09dbda77-0b4d-45c5-8c13-a82c96c8b50b-build-blob-cache\") pod \"service-telemetry-operator-1-build\" (UID: \"09dbda77-0b4d-45c5-8c13-a82c96c8b50b\") " pod="service-telemetry/service-telemetry-operator-1-build" Oct 03 00:24:31 crc kubenswrapper[4832]: I1003 00:24:31.499095 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/09dbda77-0b4d-45c5-8c13-a82c96c8b50b-build-system-configs\") pod \"service-telemetry-operator-1-build\" (UID: \"09dbda77-0b4d-45c5-8c13-a82c96c8b50b\") " pod="service-telemetry/service-telemetry-operator-1-build" Oct 03 00:24:31 crc kubenswrapper[4832]: I1003 00:24:31.499118 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/09dbda77-0b4d-45c5-8c13-a82c96c8b50b-build-proxy-ca-bundles\") pod \"service-telemetry-operator-1-build\" (UID: \"09dbda77-0b4d-45c5-8c13-a82c96c8b50b\") " pod="service-telemetry/service-telemetry-operator-1-build" Oct 03 00:24:31 crc kubenswrapper[4832]: I1003 00:24:31.499139 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/09dbda77-0b4d-45c5-8c13-a82c96c8b50b-node-pullsecrets\") pod \"service-telemetry-operator-1-build\" (UID: 
\"09dbda77-0b4d-45c5-8c13-a82c96c8b50b\") " pod="service-telemetry/service-telemetry-operator-1-build" Oct 03 00:24:31 crc kubenswrapper[4832]: I1003 00:24:31.499159 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/09dbda77-0b4d-45c5-8c13-a82c96c8b50b-buildworkdir\") pod \"service-telemetry-operator-1-build\" (UID: \"09dbda77-0b4d-45c5-8c13-a82c96c8b50b\") " pod="service-telemetry/service-telemetry-operator-1-build" Oct 03 00:24:31 crc kubenswrapper[4832]: I1003 00:24:31.499185 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-bl95s-pull\" (UniqueName: \"kubernetes.io/secret/09dbda77-0b4d-45c5-8c13-a82c96c8b50b-builder-dockercfg-bl95s-pull\") pod \"service-telemetry-operator-1-build\" (UID: \"09dbda77-0b4d-45c5-8c13-a82c96c8b50b\") " pod="service-telemetry/service-telemetry-operator-1-build" Oct 03 00:24:31 crc kubenswrapper[4832]: I1003 00:24:31.499241 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-bl95s-push\" (UniqueName: \"kubernetes.io/secret/09dbda77-0b4d-45c5-8c13-a82c96c8b50b-builder-dockercfg-bl95s-push\") pod \"service-telemetry-operator-1-build\" (UID: \"09dbda77-0b4d-45c5-8c13-a82c96c8b50b\") " pod="service-telemetry/service-telemetry-operator-1-build" Oct 03 00:24:31 crc kubenswrapper[4832]: I1003 00:24:31.499269 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/09dbda77-0b4d-45c5-8c13-a82c96c8b50b-container-storage-root\") pod \"service-telemetry-operator-1-build\" (UID: \"09dbda77-0b4d-45c5-8c13-a82c96c8b50b\") " pod="service-telemetry/service-telemetry-operator-1-build" Oct 03 00:24:31 crc kubenswrapper[4832]: I1003 00:24:31.499293 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/09dbda77-0b4d-45c5-8c13-a82c96c8b50b-container-storage-run\") pod \"service-telemetry-operator-1-build\" (UID: \"09dbda77-0b4d-45c5-8c13-a82c96c8b50b\") " pod="service-telemetry/service-telemetry-operator-1-build" Oct 03 00:24:31 crc kubenswrapper[4832]: I1003 00:24:31.499321 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxdzn\" (UniqueName: \"kubernetes.io/projected/09dbda77-0b4d-45c5-8c13-a82c96c8b50b-kube-api-access-qxdzn\") pod \"service-telemetry-operator-1-build\" (UID: \"09dbda77-0b4d-45c5-8c13-a82c96c8b50b\") " pod="service-telemetry/service-telemetry-operator-1-build" Oct 03 00:24:31 crc kubenswrapper[4832]: I1003 00:24:31.600831 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/09dbda77-0b4d-45c5-8c13-a82c96c8b50b-build-proxy-ca-bundles\") pod \"service-telemetry-operator-1-build\" (UID: \"09dbda77-0b4d-45c5-8c13-a82c96c8b50b\") " pod="service-telemetry/service-telemetry-operator-1-build" Oct 03 00:24:31 crc kubenswrapper[4832]: I1003 00:24:31.600888 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/09dbda77-0b4d-45c5-8c13-a82c96c8b50b-node-pullsecrets\") pod \"service-telemetry-operator-1-build\" (UID: \"09dbda77-0b4d-45c5-8c13-a82c96c8b50b\") " 
pod="service-telemetry/service-telemetry-operator-1-build" Oct 03 00:24:31 crc kubenswrapper[4832]: I1003 00:24:31.600916 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/09dbda77-0b4d-45c5-8c13-a82c96c8b50b-buildworkdir\") pod \"service-telemetry-operator-1-build\" (UID: \"09dbda77-0b4d-45c5-8c13-a82c96c8b50b\") " pod="service-telemetry/service-telemetry-operator-1-build" Oct 03 00:24:31 crc kubenswrapper[4832]: I1003 00:24:31.600943 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-bl95s-pull\" (UniqueName: \"kubernetes.io/secret/09dbda77-0b4d-45c5-8c13-a82c96c8b50b-builder-dockercfg-bl95s-pull\") pod \"service-telemetry-operator-1-build\" (UID: \"09dbda77-0b4d-45c5-8c13-a82c96c8b50b\") " pod="service-telemetry/service-telemetry-operator-1-build" Oct 03 00:24:31 crc kubenswrapper[4832]: I1003 00:24:31.600972 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-bl95s-push\" (UniqueName: \"kubernetes.io/secret/09dbda77-0b4d-45c5-8c13-a82c96c8b50b-builder-dockercfg-bl95s-push\") pod \"service-telemetry-operator-1-build\" (UID: \"09dbda77-0b4d-45c5-8c13-a82c96c8b50b\") " pod="service-telemetry/service-telemetry-operator-1-build" Oct 03 00:24:31 crc kubenswrapper[4832]: I1003 00:24:31.601004 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/09dbda77-0b4d-45c5-8c13-a82c96c8b50b-container-storage-root\") pod \"service-telemetry-operator-1-build\" (UID: \"09dbda77-0b4d-45c5-8c13-a82c96c8b50b\") " pod="service-telemetry/service-telemetry-operator-1-build" Oct 03 00:24:31 crc kubenswrapper[4832]: I1003 00:24:31.601027 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/09dbda77-0b4d-45c5-8c13-a82c96c8b50b-node-pullsecrets\") pod \"service-telemetry-operator-1-build\" (UID: \"09dbda77-0b4d-45c5-8c13-a82c96c8b50b\") " pod="service-telemetry/service-telemetry-operator-1-build" Oct 03 00:24:31 crc kubenswrapper[4832]: I1003 00:24:31.601358 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/09dbda77-0b4d-45c5-8c13-a82c96c8b50b-container-storage-run\") pod \"service-telemetry-operator-1-build\" (UID: \"09dbda77-0b4d-45c5-8c13-a82c96c8b50b\") " pod="service-telemetry/service-telemetry-operator-1-build" Oct 03 00:24:31 crc kubenswrapper[4832]: I1003 00:24:31.601579 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qxdzn\" (UniqueName: \"kubernetes.io/projected/09dbda77-0b4d-45c5-8c13-a82c96c8b50b-kube-api-access-qxdzn\") pod \"service-telemetry-operator-1-build\" (UID: \"09dbda77-0b4d-45c5-8c13-a82c96c8b50b\") " pod="service-telemetry/service-telemetry-operator-1-build" Oct 03 00:24:31 crc kubenswrapper[4832]: I1003 00:24:31.601522 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/09dbda77-0b4d-45c5-8c13-a82c96c8b50b-container-storage-root\") pod \"service-telemetry-operator-1-build\" (UID: \"09dbda77-0b4d-45c5-8c13-a82c96c8b50b\") " pod="service-telemetry/service-telemetry-operator-1-build" Oct 03 00:24:31 crc kubenswrapper[4832]: I1003 00:24:31.601453 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/09dbda77-0b4d-45c5-8c13-a82c96c8b50b-buildworkdir\") pod \"service-telemetry-operator-1-build\" (UID: \"09dbda77-0b4d-45c5-8c13-a82c96c8b50b\") " pod="service-telemetry/service-telemetry-operator-1-build" Oct 03 00:24:31 crc kubenswrapper[4832]: I1003 00:24:31.601606 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/09dbda77-0b4d-45c5-8c13-a82c96c8b50b-container-storage-run\") pod \"service-telemetry-operator-1-build\" (UID: \"09dbda77-0b4d-45c5-8c13-a82c96c8b50b\") " pod="service-telemetry/service-telemetry-operator-1-build" Oct 03 00:24:31 crc kubenswrapper[4832]: I1003 00:24:31.602118 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/09dbda77-0b4d-45c5-8c13-a82c96c8b50b-buildcachedir\") pod \"service-telemetry-operator-1-build\" (UID: \"09dbda77-0b4d-45c5-8c13-a82c96c8b50b\") " pod="service-telemetry/service-telemetry-operator-1-build" Oct 03 00:24:31 crc kubenswrapper[4832]: I1003 00:24:31.602210 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/09dbda77-0b4d-45c5-8c13-a82c96c8b50b-buildcachedir\") pod \"service-telemetry-operator-1-build\" (UID: \"09dbda77-0b4d-45c5-8c13-a82c96c8b50b\") " pod="service-telemetry/service-telemetry-operator-1-build" Oct 03 00:24:31 crc kubenswrapper[4832]: I1003 00:24:31.602223 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/09dbda77-0b4d-45c5-8c13-a82c96c8b50b-build-ca-bundles\") pod \"service-telemetry-operator-1-build\" (UID: \"09dbda77-0b4d-45c5-8c13-a82c96c8b50b\") " pod="service-telemetry/service-telemetry-operator-1-build" Oct 03 00:24:31 crc kubenswrapper[4832]: I1003 00:24:31.602343 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/09dbda77-0b4d-45c5-8c13-a82c96c8b50b-build-blob-cache\") pod \"service-telemetry-operator-1-build\" (UID: \"09dbda77-0b4d-45c5-8c13-a82c96c8b50b\") " pod="service-telemetry/service-telemetry-operator-1-build" Oct 03 00:24:31 crc kubenswrapper[4832]: I1003 00:24:31.602393 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/09dbda77-0b4d-45c5-8c13-a82c96c8b50b-build-system-configs\") pod \"service-telemetry-operator-1-build\" (UID: \"09dbda77-0b4d-45c5-8c13-a82c96c8b50b\") " pod="service-telemetry/service-telemetry-operator-1-build" Oct 03 00:24:31 crc kubenswrapper[4832]: I1003 00:24:31.602632 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/09dbda77-0b4d-45c5-8c13-a82c96c8b50b-build-proxy-ca-bundles\") pod \"service-telemetry-operator-1-build\" (UID: \"09dbda77-0b4d-45c5-8c13-a82c96c8b50b\") " pod="service-telemetry/service-telemetry-operator-1-build" Oct 03 00:24:31 crc kubenswrapper[4832]: I1003 00:24:31.603014 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/09dbda77-0b4d-45c5-8c13-a82c96c8b50b-build-system-configs\") pod \"service-telemetry-operator-1-build\" (UID: \"09dbda77-0b4d-45c5-8c13-a82c96c8b50b\") " 
pod="service-telemetry/service-telemetry-operator-1-build" Oct 03 00:24:31 crc kubenswrapper[4832]: I1003 00:24:31.603073 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/09dbda77-0b4d-45c5-8c13-a82c96c8b50b-build-blob-cache\") pod \"service-telemetry-operator-1-build\" (UID: \"09dbda77-0b4d-45c5-8c13-a82c96c8b50b\") " pod="service-telemetry/service-telemetry-operator-1-build" Oct 03 00:24:31 crc kubenswrapper[4832]: I1003 00:24:31.603134 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/09dbda77-0b4d-45c5-8c13-a82c96c8b50b-build-ca-bundles\") pod \"service-telemetry-operator-1-build\" (UID: \"09dbda77-0b4d-45c5-8c13-a82c96c8b50b\") " pod="service-telemetry/service-telemetry-operator-1-build" Oct 03 00:24:31 crc kubenswrapper[4832]: I1003 00:24:31.607096 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-bl95s-pull\" (UniqueName: \"kubernetes.io/secret/09dbda77-0b4d-45c5-8c13-a82c96c8b50b-builder-dockercfg-bl95s-pull\") pod \"service-telemetry-operator-1-build\" (UID: \"09dbda77-0b4d-45c5-8c13-a82c96c8b50b\") " pod="service-telemetry/service-telemetry-operator-1-build" Oct 03 00:24:31 crc kubenswrapper[4832]: I1003 00:24:31.607524 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-bl95s-push\" (UniqueName: \"kubernetes.io/secret/09dbda77-0b4d-45c5-8c13-a82c96c8b50b-builder-dockercfg-bl95s-push\") pod \"service-telemetry-operator-1-build\" (UID: \"09dbda77-0b4d-45c5-8c13-a82c96c8b50b\") " pod="service-telemetry/service-telemetry-operator-1-build" Oct 03 00:24:31 crc kubenswrapper[4832]: I1003 00:24:31.626602 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxdzn\" (UniqueName: \"kubernetes.io/projected/09dbda77-0b4d-45c5-8c13-a82c96c8b50b-kube-api-access-qxdzn\") pod \"service-telemetry-operator-1-build\" (UID: \"09dbda77-0b4d-45c5-8c13-a82c96c8b50b\") " pod="service-telemetry/service-telemetry-operator-1-build" Oct 03 00:24:31 crc kubenswrapper[4832]: I1003 00:24:31.693622 4832 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-1-build" Oct 03 00:24:31 crc kubenswrapper[4832]: I1003 00:24:31.823953 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elasticsearch-es-default-0" event={"ID":"43340dac-0a14-49ad-b56c-22a735ee0273","Type":"ContainerStarted","Data":"50889813300f67a5c09cb910c605fd0f4b1f7cb4d5fbed416f536e7a6570ae4e"} Oct 03 00:24:31 crc kubenswrapper[4832]: I1003 00:24:31.824192 4832 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="service-telemetry/elasticsearch-es-default-0" Oct 03 00:24:31 crc kubenswrapper[4832]: I1003 00:24:31.895316 4832 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/elasticsearch-es-default-0" podStartSLOduration=4.825518035 podStartE2EDuration="39.89529748s" podCreationTimestamp="2025-10-03 00:23:52 +0000 UTC" firstStartedPulling="2025-10-03 00:23:52.903379037 +0000 UTC m=+1026.165534143" lastFinishedPulling="2025-10-03 00:24:27.973158482 +0000 UTC m=+1061.235313588" observedRunningTime="2025-10-03 00:24:31.885624108 +0000 UTC m=+1065.147779214" watchObservedRunningTime="2025-10-03 00:24:31.89529748 +0000 UTC m=+1065.157452586" Oct 03 00:24:31 crc kubenswrapper[4832]: I1003 00:24:31.937724 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-1-build"] Oct 03 00:24:31 crc kubenswrapper[4832]: W1003 00:24:31.943591 4832 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod09dbda77_0b4d_45c5_8c13_a82c96c8b50b.slice/crio-b5a77ede6e4e3091f8b57a68fecfc3dd73544f4d33ce1ad9d0589f32f300f5af WatchSource:0}: Error finding container b5a77ede6e4e3091f8b57a68fecfc3dd73544f4d33ce1ad9d0589f32f300f5af: Status 404 returned error can't find the container with id b5a77ede6e4e3091f8b57a68fecfc3dd73544f4d33ce1ad9d0589f32f300f5af Oct 03 00:24:32 crc kubenswrapper[4832]: I1003 00:24:32.831585 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-1-build" event={"ID":"09dbda77-0b4d-45c5-8c13-a82c96c8b50b","Type":"ContainerStarted","Data":"b5a77ede6e4e3091f8b57a68fecfc3dd73544f4d33ce1ad9d0589f32f300f5af"} Oct 03 00:24:37 crc kubenswrapper[4832]: I1003 00:24:37.863197 4832 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-7d4cc89fcb-p6wlh"] Oct 03 00:24:37 crc kubenswrapper[4832]: I1003 00:24:37.864156 4832 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-7d4cc89fcb-p6wlh" Oct 03 00:24:37 crc kubenswrapper[4832]: I1003 00:24:37.867294 4832 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-p2bkx" Oct 03 00:24:37 crc kubenswrapper[4832]: I1003 00:24:37.876037 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-7d4cc89fcb-p6wlh"] Oct 03 00:24:37 crc kubenswrapper[4832]: I1003 00:24:37.990566 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwvql\" (UniqueName: \"kubernetes.io/projected/88ee0f88-75e2-46d8-ba6e-fc853e8cdea4-kube-api-access-gwvql\") pod \"cert-manager-7d4cc89fcb-p6wlh\" (UID: \"88ee0f88-75e2-46d8-ba6e-fc853e8cdea4\") " pod="cert-manager/cert-manager-7d4cc89fcb-p6wlh" Oct 03 00:24:37 crc kubenswrapper[4832]: I1003 00:24:37.990636 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/88ee0f88-75e2-46d8-ba6e-fc853e8cdea4-bound-sa-token\") pod \"cert-manager-7d4cc89fcb-p6wlh\" (UID: \"88ee0f88-75e2-46d8-ba6e-fc853e8cdea4\") " pod="cert-manager/cert-manager-7d4cc89fcb-p6wlh" Oct 03 00:24:38 crc kubenswrapper[4832]: I1003 00:24:38.092118 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gwvql\" (UniqueName: \"kubernetes.io/projected/88ee0f88-75e2-46d8-ba6e-fc853e8cdea4-kube-api-access-gwvql\") pod \"cert-manager-7d4cc89fcb-p6wlh\" (UID: \"88ee0f88-75e2-46d8-ba6e-fc853e8cdea4\") " pod="cert-manager/cert-manager-7d4cc89fcb-p6wlh" Oct 03 00:24:38 crc kubenswrapper[4832]: I1003 00:24:38.092224 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/88ee0f88-75e2-46d8-ba6e-fc853e8cdea4-bound-sa-token\") pod \"cert-manager-7d4cc89fcb-p6wlh\" (UID: \"88ee0f88-75e2-46d8-ba6e-fc853e8cdea4\") " pod="cert-manager/cert-manager-7d4cc89fcb-p6wlh" Oct 03 00:24:38 crc kubenswrapper[4832]: I1003 00:24:38.108688 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/88ee0f88-75e2-46d8-ba6e-fc853e8cdea4-bound-sa-token\") pod \"cert-manager-7d4cc89fcb-p6wlh\" (UID: \"88ee0f88-75e2-46d8-ba6e-fc853e8cdea4\") " pod="cert-manager/cert-manager-7d4cc89fcb-p6wlh" Oct 03 00:24:38 crc kubenswrapper[4832]: I1003 00:24:38.121588 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwvql\" (UniqueName: \"kubernetes.io/projected/88ee0f88-75e2-46d8-ba6e-fc853e8cdea4-kube-api-access-gwvql\") pod \"cert-manager-7d4cc89fcb-p6wlh\" (UID: \"88ee0f88-75e2-46d8-ba6e-fc853e8cdea4\") " pod="cert-manager/cert-manager-7d4cc89fcb-p6wlh" Oct 03 00:24:38 crc kubenswrapper[4832]: I1003 00:24:38.189058 4832 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-7d4cc89fcb-p6wlh" Oct 03 00:24:39 crc kubenswrapper[4832]: I1003 00:24:39.118245 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-7d4cc89fcb-p6wlh"] Oct 03 00:24:39 crc kubenswrapper[4832]: W1003 00:24:39.122068 4832 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod88ee0f88_75e2_46d8_ba6e_fc853e8cdea4.slice/crio-da4ae32e42c600629ba35883f1883c4288f60562915ab760abfc66cfc360021e WatchSource:0}: Error finding container da4ae32e42c600629ba35883f1883c4288f60562915ab760abfc66cfc360021e: Status 404 returned error can't find the container with id da4ae32e42c600629ba35883f1883c4288f60562915ab760abfc66cfc360021e Oct 03 00:24:39 crc kubenswrapper[4832]: I1003 00:24:39.867800 4832 generic.go:334] "Generic (PLEG): container finished" podID="09dbda77-0b4d-45c5-8c13-a82c96c8b50b" containerID="e6ccfa05f5aeff6be4ed27162ffb7560d3561d996797c769ba96c732e122fab6" exitCode=0 Oct 03 00:24:39 crc kubenswrapper[4832]: I1003 00:24:39.867854 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-1-build" event={"ID":"09dbda77-0b4d-45c5-8c13-a82c96c8b50b","Type":"ContainerDied","Data":"e6ccfa05f5aeff6be4ed27162ffb7560d3561d996797c769ba96c732e122fab6"} Oct 03 00:24:39 crc kubenswrapper[4832]: I1003 00:24:39.869186 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-7d4cc89fcb-p6wlh" event={"ID":"88ee0f88-75e2-46d8-ba6e-fc853e8cdea4","Type":"ContainerStarted","Data":"ecacc3612803fd5439492f3a27aa0cf0eb8fd3d62118b7145a9361803c97f1f1"} Oct 03 00:24:39 crc kubenswrapper[4832]: I1003 00:24:39.869233 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-7d4cc89fcb-p6wlh" event={"ID":"88ee0f88-75e2-46d8-ba6e-fc853e8cdea4","Type":"ContainerStarted","Data":"da4ae32e42c600629ba35883f1883c4288f60562915ab760abfc66cfc360021e"} Oct 03 00:24:39 crc kubenswrapper[4832]: I1003 00:24:39.913511 4832 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-7d4cc89fcb-p6wlh" podStartSLOduration=2.913492648 podStartE2EDuration="2.913492648s" podCreationTimestamp="2025-10-03 00:24:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 00:24:39.912150987 +0000 UTC m=+1073.174306103" watchObservedRunningTime="2025-10-03 00:24:39.913492648 +0000 UTC m=+1073.175647754" Oct 03 00:24:40 crc kubenswrapper[4832]: I1003 00:24:40.877459 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-1-build" event={"ID":"09dbda77-0b4d-45c5-8c13-a82c96c8b50b","Type":"ContainerStarted","Data":"491043747d24f670f05f37a755c5575160c08d85179f1352914ee6b89f736640"} Oct 03 00:24:40 crc kubenswrapper[4832]: I1003 00:24:40.905126 4832 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/service-telemetry-operator-1-build" podStartSLOduration=3.0719750279999998 podStartE2EDuration="9.905106491s" podCreationTimestamp="2025-10-03 00:24:31 +0000 UTC" firstStartedPulling="2025-10-03 00:24:31.946145958 +0000 UTC m=+1065.208301064" lastFinishedPulling="2025-10-03 00:24:38.779277421 +0000 UTC m=+1072.041432527" observedRunningTime="2025-10-03 00:24:40.898409532 +0000 UTC m=+1074.160564658" watchObservedRunningTime="2025-10-03 00:24:40.905106491 +0000 UTC m=+1074.167261607" 
Oct 03 00:24:41 crc kubenswrapper[4832]: I1003 00:24:41.914331 4832 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/service-telemetry-operator-1-build"] Oct 03 00:24:42 crc kubenswrapper[4832]: I1003 00:24:42.603028 4832 prober.go:107] "Probe failed" probeType="Readiness" pod="service-telemetry/elasticsearch-es-default-0" podUID="43340dac-0a14-49ad-b56c-22a735ee0273" containerName="elasticsearch" probeResult="failure" output=< Oct 03 00:24:42 crc kubenswrapper[4832]: {"timestamp": "2025-10-03T00:24:42+00:00", "message": "readiness probe failed", "curl_rc": "7"} Oct 03 00:24:42 crc kubenswrapper[4832]: > Oct 03 00:24:42 crc kubenswrapper[4832]: I1003 00:24:42.887917 4832 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/service-telemetry-operator-1-build" podUID="09dbda77-0b4d-45c5-8c13-a82c96c8b50b" containerName="docker-build" containerID="cri-o://491043747d24f670f05f37a755c5575160c08d85179f1352914ee6b89f736640" gracePeriod=30 Oct 03 00:24:43 crc kubenswrapper[4832]: I1003 00:24:43.485460 4832 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/service-telemetry-operator-2-build"] Oct 03 00:24:43 crc kubenswrapper[4832]: I1003 00:24:43.486810 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-2-build" Oct 03 00:24:43 crc kubenswrapper[4832]: I1003 00:24:43.489430 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-2-global-ca" Oct 03 00:24:43 crc kubenswrapper[4832]: I1003 00:24:43.489615 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-2-ca" Oct 03 00:24:43 crc kubenswrapper[4832]: I1003 00:24:43.489776 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-2-sys-config" Oct 03 00:24:43 crc kubenswrapper[4832]: I1003 00:24:43.506812 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-2-build"] Oct 03 00:24:43 crc kubenswrapper[4832]: I1003 00:24:43.562876 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/f0837274-ab99-4217-840d-56315121e361-buildworkdir\") pod \"service-telemetry-operator-2-build\" (UID: \"f0837274-ab99-4217-840d-56315121e361\") " pod="service-telemetry/service-telemetry-operator-2-build" Oct 03 00:24:43 crc kubenswrapper[4832]: I1003 00:24:43.562922 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f0837274-ab99-4217-840d-56315121e361-build-ca-bundles\") pod \"service-telemetry-operator-2-build\" (UID: \"f0837274-ab99-4217-840d-56315121e361\") " pod="service-telemetry/service-telemetry-operator-2-build" Oct 03 00:24:43 crc kubenswrapper[4832]: I1003 00:24:43.562965 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/f0837274-ab99-4217-840d-56315121e361-container-storage-run\") pod \"service-telemetry-operator-2-build\" (UID: \"f0837274-ab99-4217-840d-56315121e361\") " pod="service-telemetry/service-telemetry-operator-2-build" Oct 03 00:24:43 crc kubenswrapper[4832]: I1003 00:24:43.562981 4832 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5fvc\" (UniqueName: \"kubernetes.io/projected/f0837274-ab99-4217-840d-56315121e361-kube-api-access-f5fvc\") pod \"service-telemetry-operator-2-build\" (UID: \"f0837274-ab99-4217-840d-56315121e361\") " pod="service-telemetry/service-telemetry-operator-2-build" Oct 03 00:24:43 crc kubenswrapper[4832]: I1003 00:24:43.563005 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-bl95s-pull\" (UniqueName: \"kubernetes.io/secret/f0837274-ab99-4217-840d-56315121e361-builder-dockercfg-bl95s-pull\") pod \"service-telemetry-operator-2-build\" (UID: \"f0837274-ab99-4217-840d-56315121e361\") " pod="service-telemetry/service-telemetry-operator-2-build" Oct 03 00:24:43 crc kubenswrapper[4832]: I1003 00:24:43.563023 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/f0837274-ab99-4217-840d-56315121e361-container-storage-root\") pod \"service-telemetry-operator-2-build\" (UID: \"f0837274-ab99-4217-840d-56315121e361\") " pod="service-telemetry/service-telemetry-operator-2-build" Oct 03 00:24:43 crc kubenswrapper[4832]: I1003 00:24:43.563044 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-bl95s-push\" (UniqueName: \"kubernetes.io/secret/f0837274-ab99-4217-840d-56315121e361-builder-dockercfg-bl95s-push\") pod \"service-telemetry-operator-2-build\" (UID: \"f0837274-ab99-4217-840d-56315121e361\") " pod="service-telemetry/service-telemetry-operator-2-build" Oct 03 00:24:43 crc kubenswrapper[4832]: I1003 00:24:43.563060 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/f0837274-ab99-4217-840d-56315121e361-buildcachedir\") pod \"service-telemetry-operator-2-build\" (UID: \"f0837274-ab99-4217-840d-56315121e361\") " pod="service-telemetry/service-telemetry-operator-2-build" Oct 03 00:24:43 crc kubenswrapper[4832]: I1003 00:24:43.563080 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/f0837274-ab99-4217-840d-56315121e361-build-system-configs\") pod \"service-telemetry-operator-2-build\" (UID: \"f0837274-ab99-4217-840d-56315121e361\") " pod="service-telemetry/service-telemetry-operator-2-build" Oct 03 00:24:43 crc kubenswrapper[4832]: I1003 00:24:43.563102 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/f0837274-ab99-4217-840d-56315121e361-node-pullsecrets\") pod \"service-telemetry-operator-2-build\" (UID: \"f0837274-ab99-4217-840d-56315121e361\") " pod="service-telemetry/service-telemetry-operator-2-build" Oct 03 00:24:43 crc kubenswrapper[4832]: I1003 00:24:43.563210 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/f0837274-ab99-4217-840d-56315121e361-build-blob-cache\") pod \"service-telemetry-operator-2-build\" (UID: \"f0837274-ab99-4217-840d-56315121e361\") " pod="service-telemetry/service-telemetry-operator-2-build" Oct 03 00:24:43 crc kubenswrapper[4832]: I1003 00:24:43.563298 4832 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f0837274-ab99-4217-840d-56315121e361-build-proxy-ca-bundles\") pod \"service-telemetry-operator-2-build\" (UID: \"f0837274-ab99-4217-840d-56315121e361\") " pod="service-telemetry/service-telemetry-operator-2-build" Oct 03 00:24:43 crc kubenswrapper[4832]: I1003 00:24:43.664630 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f0837274-ab99-4217-840d-56315121e361-build-proxy-ca-bundles\") pod \"service-telemetry-operator-2-build\" (UID: \"f0837274-ab99-4217-840d-56315121e361\") " pod="service-telemetry/service-telemetry-operator-2-build" Oct 03 00:24:43 crc kubenswrapper[4832]: I1003 00:24:43.665200 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/f0837274-ab99-4217-840d-56315121e361-buildworkdir\") pod \"service-telemetry-operator-2-build\" (UID: \"f0837274-ab99-4217-840d-56315121e361\") " pod="service-telemetry/service-telemetry-operator-2-build" Oct 03 00:24:43 crc kubenswrapper[4832]: I1003 00:24:43.665340 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f0837274-ab99-4217-840d-56315121e361-build-ca-bundles\") pod \"service-telemetry-operator-2-build\" (UID: \"f0837274-ab99-4217-840d-56315121e361\") " pod="service-telemetry/service-telemetry-operator-2-build" Oct 03 00:24:43 crc kubenswrapper[4832]: I1003 00:24:43.665544 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/f0837274-ab99-4217-840d-56315121e361-buildworkdir\") pod \"service-telemetry-operator-2-build\" (UID: \"f0837274-ab99-4217-840d-56315121e361\") " pod="service-telemetry/service-telemetry-operator-2-build" Oct 03 00:24:43 crc kubenswrapper[4832]: I1003 00:24:43.665422 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f0837274-ab99-4217-840d-56315121e361-build-proxy-ca-bundles\") pod \"service-telemetry-operator-2-build\" (UID: \"f0837274-ab99-4217-840d-56315121e361\") " pod="service-telemetry/service-telemetry-operator-2-build" Oct 03 00:24:43 crc kubenswrapper[4832]: I1003 00:24:43.665776 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/f0837274-ab99-4217-840d-56315121e361-container-storage-run\") pod \"service-telemetry-operator-2-build\" (UID: \"f0837274-ab99-4217-840d-56315121e361\") " pod="service-telemetry/service-telemetry-operator-2-build" Oct 03 00:24:43 crc kubenswrapper[4832]: I1003 00:24:43.665813 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f5fvc\" (UniqueName: \"kubernetes.io/projected/f0837274-ab99-4217-840d-56315121e361-kube-api-access-f5fvc\") pod \"service-telemetry-operator-2-build\" (UID: \"f0837274-ab99-4217-840d-56315121e361\") " pod="service-telemetry/service-telemetry-operator-2-build" Oct 03 00:24:43 crc kubenswrapper[4832]: I1003 00:24:43.665858 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-bl95s-pull\" (UniqueName: \"kubernetes.io/secret/f0837274-ab99-4217-840d-56315121e361-builder-dockercfg-bl95s-pull\") pod 
\"service-telemetry-operator-2-build\" (UID: \"f0837274-ab99-4217-840d-56315121e361\") " pod="service-telemetry/service-telemetry-operator-2-build" Oct 03 00:24:43 crc kubenswrapper[4832]: I1003 00:24:43.665884 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/f0837274-ab99-4217-840d-56315121e361-container-storage-root\") pod \"service-telemetry-operator-2-build\" (UID: \"f0837274-ab99-4217-840d-56315121e361\") " pod="service-telemetry/service-telemetry-operator-2-build" Oct 03 00:24:43 crc kubenswrapper[4832]: I1003 00:24:43.665916 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-bl95s-push\" (UniqueName: \"kubernetes.io/secret/f0837274-ab99-4217-840d-56315121e361-builder-dockercfg-bl95s-push\") pod \"service-telemetry-operator-2-build\" (UID: \"f0837274-ab99-4217-840d-56315121e361\") " pod="service-telemetry/service-telemetry-operator-2-build" Oct 03 00:24:43 crc kubenswrapper[4832]: I1003 00:24:43.665938 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/f0837274-ab99-4217-840d-56315121e361-buildcachedir\") pod \"service-telemetry-operator-2-build\" (UID: \"f0837274-ab99-4217-840d-56315121e361\") " pod="service-telemetry/service-telemetry-operator-2-build" Oct 03 00:24:43 crc kubenswrapper[4832]: I1003 00:24:43.665979 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/f0837274-ab99-4217-840d-56315121e361-build-system-configs\") pod \"service-telemetry-operator-2-build\" (UID: \"f0837274-ab99-4217-840d-56315121e361\") " pod="service-telemetry/service-telemetry-operator-2-build" Oct 03 00:24:43 crc kubenswrapper[4832]: I1003 00:24:43.666034 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/f0837274-ab99-4217-840d-56315121e361-container-storage-run\") pod \"service-telemetry-operator-2-build\" (UID: \"f0837274-ab99-4217-840d-56315121e361\") " pod="service-telemetry/service-telemetry-operator-2-build" Oct 03 00:24:43 crc kubenswrapper[4832]: I1003 00:24:43.666045 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/f0837274-ab99-4217-840d-56315121e361-node-pullsecrets\") pod \"service-telemetry-operator-2-build\" (UID: \"f0837274-ab99-4217-840d-56315121e361\") " pod="service-telemetry/service-telemetry-operator-2-build" Oct 03 00:24:43 crc kubenswrapper[4832]: I1003 00:24:43.666083 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/f0837274-ab99-4217-840d-56315121e361-build-blob-cache\") pod \"service-telemetry-operator-2-build\" (UID: \"f0837274-ab99-4217-840d-56315121e361\") " pod="service-telemetry/service-telemetry-operator-2-build" Oct 03 00:24:43 crc kubenswrapper[4832]: I1003 00:24:43.666257 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f0837274-ab99-4217-840d-56315121e361-build-ca-bundles\") pod \"service-telemetry-operator-2-build\" (UID: \"f0837274-ab99-4217-840d-56315121e361\") " pod="service-telemetry/service-telemetry-operator-2-build" Oct 03 00:24:43 crc kubenswrapper[4832]: I1003 00:24:43.666368 4832 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/f0837274-ab99-4217-840d-56315121e361-build-blob-cache\") pod \"service-telemetry-operator-2-build\" (UID: \"f0837274-ab99-4217-840d-56315121e361\") " pod="service-telemetry/service-telemetry-operator-2-build" Oct 03 00:24:43 crc kubenswrapper[4832]: I1003 00:24:43.666441 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/f0837274-ab99-4217-840d-56315121e361-buildcachedir\") pod \"service-telemetry-operator-2-build\" (UID: \"f0837274-ab99-4217-840d-56315121e361\") " pod="service-telemetry/service-telemetry-operator-2-build" Oct 03 00:24:43 crc kubenswrapper[4832]: I1003 00:24:43.666577 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/f0837274-ab99-4217-840d-56315121e361-container-storage-root\") pod \"service-telemetry-operator-2-build\" (UID: \"f0837274-ab99-4217-840d-56315121e361\") " pod="service-telemetry/service-telemetry-operator-2-build" Oct 03 00:24:43 crc kubenswrapper[4832]: I1003 00:24:43.666570 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/f0837274-ab99-4217-840d-56315121e361-node-pullsecrets\") pod \"service-telemetry-operator-2-build\" (UID: \"f0837274-ab99-4217-840d-56315121e361\") " pod="service-telemetry/service-telemetry-operator-2-build" Oct 03 00:24:43 crc kubenswrapper[4832]: I1003 00:24:43.666933 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/f0837274-ab99-4217-840d-56315121e361-build-system-configs\") pod \"service-telemetry-operator-2-build\" (UID: \"f0837274-ab99-4217-840d-56315121e361\") " pod="service-telemetry/service-telemetry-operator-2-build" Oct 03 00:24:43 crc kubenswrapper[4832]: I1003 00:24:43.671902 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-bl95s-pull\" (UniqueName: \"kubernetes.io/secret/f0837274-ab99-4217-840d-56315121e361-builder-dockercfg-bl95s-pull\") pod \"service-telemetry-operator-2-build\" (UID: \"f0837274-ab99-4217-840d-56315121e361\") " pod="service-telemetry/service-telemetry-operator-2-build" Oct 03 00:24:43 crc kubenswrapper[4832]: I1003 00:24:43.672228 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-bl95s-push\" (UniqueName: \"kubernetes.io/secret/f0837274-ab99-4217-840d-56315121e361-builder-dockercfg-bl95s-push\") pod \"service-telemetry-operator-2-build\" (UID: \"f0837274-ab99-4217-840d-56315121e361\") " pod="service-telemetry/service-telemetry-operator-2-build" Oct 03 00:24:43 crc kubenswrapper[4832]: I1003 00:24:43.681950 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5fvc\" (UniqueName: \"kubernetes.io/projected/f0837274-ab99-4217-840d-56315121e361-kube-api-access-f5fvc\") pod \"service-telemetry-operator-2-build\" (UID: \"f0837274-ab99-4217-840d-56315121e361\") " pod="service-telemetry/service-telemetry-operator-2-build" Oct 03 00:24:43 crc kubenswrapper[4832]: I1003 00:24:43.807698 4832 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-2-build" Oct 03 00:24:44 crc kubenswrapper[4832]: I1003 00:24:44.222505 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-2-build"] Oct 03 00:24:44 crc kubenswrapper[4832]: W1003 00:24:44.228749 4832 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf0837274_ab99_4217_840d_56315121e361.slice/crio-f9938a57c5c656c5ea5157e4aa2dbefb3da840abd39cb13c0df92d53410ed408 WatchSource:0}: Error finding container f9938a57c5c656c5ea5157e4aa2dbefb3da840abd39cb13c0df92d53410ed408: Status 404 returned error can't find the container with id f9938a57c5c656c5ea5157e4aa2dbefb3da840abd39cb13c0df92d53410ed408 Oct 03 00:24:44 crc kubenswrapper[4832]: I1003 00:24:44.899897 4832 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-1-build_09dbda77-0b4d-45c5-8c13-a82c96c8b50b/docker-build/0.log" Oct 03 00:24:44 crc kubenswrapper[4832]: I1003 00:24:44.900715 4832 generic.go:334] "Generic (PLEG): container finished" podID="09dbda77-0b4d-45c5-8c13-a82c96c8b50b" containerID="491043747d24f670f05f37a755c5575160c08d85179f1352914ee6b89f736640" exitCode=1 Oct 03 00:24:44 crc kubenswrapper[4832]: I1003 00:24:44.900810 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-1-build" event={"ID":"09dbda77-0b4d-45c5-8c13-a82c96c8b50b","Type":"ContainerDied","Data":"491043747d24f670f05f37a755c5575160c08d85179f1352914ee6b89f736640"} Oct 03 00:24:44 crc kubenswrapper[4832]: I1003 00:24:44.902483 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-2-build" event={"ID":"f0837274-ab99-4217-840d-56315121e361","Type":"ContainerStarted","Data":"bb540cdd8dd224b1a73b56f72e4aa51ad03fb0504aa6957e899ffaf6ed556f70"} Oct 03 00:24:44 crc kubenswrapper[4832]: I1003 00:24:44.902527 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-2-build" event={"ID":"f0837274-ab99-4217-840d-56315121e361","Type":"ContainerStarted","Data":"f9938a57c5c656c5ea5157e4aa2dbefb3da840abd39cb13c0df92d53410ed408"} Oct 03 00:24:45 crc kubenswrapper[4832]: I1003 00:24:45.058793 4832 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-1-build_09dbda77-0b4d-45c5-8c13-a82c96c8b50b/docker-build/0.log" Oct 03 00:24:45 crc kubenswrapper[4832]: I1003 00:24:45.059179 4832 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-1-build" Oct 03 00:24:45 crc kubenswrapper[4832]: I1003 00:24:45.187941 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/09dbda77-0b4d-45c5-8c13-a82c96c8b50b-buildworkdir\") pod \"09dbda77-0b4d-45c5-8c13-a82c96c8b50b\" (UID: \"09dbda77-0b4d-45c5-8c13-a82c96c8b50b\") " Oct 03 00:24:45 crc kubenswrapper[4832]: I1003 00:24:45.188006 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/09dbda77-0b4d-45c5-8c13-a82c96c8b50b-node-pullsecrets\") pod \"09dbda77-0b4d-45c5-8c13-a82c96c8b50b\" (UID: \"09dbda77-0b4d-45c5-8c13-a82c96c8b50b\") " Oct 03 00:24:45 crc kubenswrapper[4832]: I1003 00:24:45.188088 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-bl95s-push\" (UniqueName: \"kubernetes.io/secret/09dbda77-0b4d-45c5-8c13-a82c96c8b50b-builder-dockercfg-bl95s-push\") pod \"09dbda77-0b4d-45c5-8c13-a82c96c8b50b\" (UID: \"09dbda77-0b4d-45c5-8c13-a82c96c8b50b\") " Oct 03 00:24:45 crc kubenswrapper[4832]: I1003 00:24:45.188111 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/09dbda77-0b4d-45c5-8c13-a82c96c8b50b-build-ca-bundles\") pod \"09dbda77-0b4d-45c5-8c13-a82c96c8b50b\" (UID: \"09dbda77-0b4d-45c5-8c13-a82c96c8b50b\") " Oct 03 00:24:45 crc kubenswrapper[4832]: I1003 00:24:45.188132 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/09dbda77-0b4d-45c5-8c13-a82c96c8b50b-buildcachedir\") pod \"09dbda77-0b4d-45c5-8c13-a82c96c8b50b\" (UID: \"09dbda77-0b4d-45c5-8c13-a82c96c8b50b\") " Oct 03 00:24:45 crc kubenswrapper[4832]: I1003 00:24:45.188172 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qxdzn\" (UniqueName: \"kubernetes.io/projected/09dbda77-0b4d-45c5-8c13-a82c96c8b50b-kube-api-access-qxdzn\") pod \"09dbda77-0b4d-45c5-8c13-a82c96c8b50b\" (UID: \"09dbda77-0b4d-45c5-8c13-a82c96c8b50b\") " Oct 03 00:24:45 crc kubenswrapper[4832]: I1003 00:24:45.188186 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/09dbda77-0b4d-45c5-8c13-a82c96c8b50b-build-system-configs\") pod \"09dbda77-0b4d-45c5-8c13-a82c96c8b50b\" (UID: \"09dbda77-0b4d-45c5-8c13-a82c96c8b50b\") " Oct 03 00:24:45 crc kubenswrapper[4832]: I1003 00:24:45.188222 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/09dbda77-0b4d-45c5-8c13-a82c96c8b50b-container-storage-root\") pod \"09dbda77-0b4d-45c5-8c13-a82c96c8b50b\" (UID: \"09dbda77-0b4d-45c5-8c13-a82c96c8b50b\") " Oct 03 00:24:45 crc kubenswrapper[4832]: I1003 00:24:45.188240 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/09dbda77-0b4d-45c5-8c13-a82c96c8b50b-container-storage-run\") pod \"09dbda77-0b4d-45c5-8c13-a82c96c8b50b\" (UID: \"09dbda77-0b4d-45c5-8c13-a82c96c8b50b\") " Oct 03 00:24:45 crc kubenswrapper[4832]: I1003 00:24:45.188201 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/09dbda77-0b4d-45c5-8c13-a82c96c8b50b-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "09dbda77-0b4d-45c5-8c13-a82c96c8b50b" (UID: "09dbda77-0b4d-45c5-8c13-a82c96c8b50b"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 00:24:45 crc kubenswrapper[4832]: I1003 00:24:45.188272 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-bl95s-pull\" (UniqueName: \"kubernetes.io/secret/09dbda77-0b4d-45c5-8c13-a82c96c8b50b-builder-dockercfg-bl95s-pull\") pod \"09dbda77-0b4d-45c5-8c13-a82c96c8b50b\" (UID: \"09dbda77-0b4d-45c5-8c13-a82c96c8b50b\") " Oct 03 00:24:45 crc kubenswrapper[4832]: I1003 00:24:45.188293 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/09dbda77-0b4d-45c5-8c13-a82c96c8b50b-build-blob-cache\") pod \"09dbda77-0b4d-45c5-8c13-a82c96c8b50b\" (UID: \"09dbda77-0b4d-45c5-8c13-a82c96c8b50b\") " Oct 03 00:24:45 crc kubenswrapper[4832]: I1003 00:24:45.188313 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/09dbda77-0b4d-45c5-8c13-a82c96c8b50b-build-proxy-ca-bundles\") pod \"09dbda77-0b4d-45c5-8c13-a82c96c8b50b\" (UID: \"09dbda77-0b4d-45c5-8c13-a82c96c8b50b\") " Oct 03 00:24:45 crc kubenswrapper[4832]: I1003 00:24:45.188528 4832 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/09dbda77-0b4d-45c5-8c13-a82c96c8b50b-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Oct 03 00:24:45 crc kubenswrapper[4832]: I1003 00:24:45.188641 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/09dbda77-0b4d-45c5-8c13-a82c96c8b50b-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "09dbda77-0b4d-45c5-8c13-a82c96c8b50b" (UID: "09dbda77-0b4d-45c5-8c13-a82c96c8b50b"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 00:24:45 crc kubenswrapper[4832]: I1003 00:24:45.188684 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/09dbda77-0b4d-45c5-8c13-a82c96c8b50b-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "09dbda77-0b4d-45c5-8c13-a82c96c8b50b" (UID: "09dbda77-0b4d-45c5-8c13-a82c96c8b50b"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 00:24:45 crc kubenswrapper[4832]: I1003 00:24:45.188883 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/09dbda77-0b4d-45c5-8c13-a82c96c8b50b-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "09dbda77-0b4d-45c5-8c13-a82c96c8b50b" (UID: "09dbda77-0b4d-45c5-8c13-a82c96c8b50b"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 00:24:45 crc kubenswrapper[4832]: I1003 00:24:45.190367 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09dbda77-0b4d-45c5-8c13-a82c96c8b50b-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "09dbda77-0b4d-45c5-8c13-a82c96c8b50b" (UID: "09dbda77-0b4d-45c5-8c13-a82c96c8b50b"). InnerVolumeSpecName "build-system-configs". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 00:24:45 crc kubenswrapper[4832]: I1003 00:24:45.190400 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09dbda77-0b4d-45c5-8c13-a82c96c8b50b-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "09dbda77-0b4d-45c5-8c13-a82c96c8b50b" (UID: "09dbda77-0b4d-45c5-8c13-a82c96c8b50b"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 00:24:45 crc kubenswrapper[4832]: I1003 00:24:45.193901 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09dbda77-0b4d-45c5-8c13-a82c96c8b50b-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "09dbda77-0b4d-45c5-8c13-a82c96c8b50b" (UID: "09dbda77-0b4d-45c5-8c13-a82c96c8b50b"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 00:24:45 crc kubenswrapper[4832]: I1003 00:24:45.194178 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/09dbda77-0b4d-45c5-8c13-a82c96c8b50b-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "09dbda77-0b4d-45c5-8c13-a82c96c8b50b" (UID: "09dbda77-0b4d-45c5-8c13-a82c96c8b50b"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 00:24:45 crc kubenswrapper[4832]: I1003 00:24:45.194410 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09dbda77-0b4d-45c5-8c13-a82c96c8b50b-kube-api-access-qxdzn" (OuterVolumeSpecName: "kube-api-access-qxdzn") pod "09dbda77-0b4d-45c5-8c13-a82c96c8b50b" (UID: "09dbda77-0b4d-45c5-8c13-a82c96c8b50b"). InnerVolumeSpecName "kube-api-access-qxdzn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 00:24:45 crc kubenswrapper[4832]: I1003 00:24:45.194560 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/09dbda77-0b4d-45c5-8c13-a82c96c8b50b-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "09dbda77-0b4d-45c5-8c13-a82c96c8b50b" (UID: "09dbda77-0b4d-45c5-8c13-a82c96c8b50b"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 00:24:45 crc kubenswrapper[4832]: I1003 00:24:45.196886 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09dbda77-0b4d-45c5-8c13-a82c96c8b50b-builder-dockercfg-bl95s-push" (OuterVolumeSpecName: "builder-dockercfg-bl95s-push") pod "09dbda77-0b4d-45c5-8c13-a82c96c8b50b" (UID: "09dbda77-0b4d-45c5-8c13-a82c96c8b50b"). InnerVolumeSpecName "builder-dockercfg-bl95s-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 00:24:45 crc kubenswrapper[4832]: I1003 00:24:45.198082 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09dbda77-0b4d-45c5-8c13-a82c96c8b50b-builder-dockercfg-bl95s-pull" (OuterVolumeSpecName: "builder-dockercfg-bl95s-pull") pod "09dbda77-0b4d-45c5-8c13-a82c96c8b50b" (UID: "09dbda77-0b4d-45c5-8c13-a82c96c8b50b"). InnerVolumeSpecName "builder-dockercfg-bl95s-pull". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 00:24:45 crc kubenswrapper[4832]: I1003 00:24:45.290106 4832 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qxdzn\" (UniqueName: \"kubernetes.io/projected/09dbda77-0b4d-45c5-8c13-a82c96c8b50b-kube-api-access-qxdzn\") on node \"crc\" DevicePath \"\"" Oct 03 00:24:45 crc kubenswrapper[4832]: I1003 00:24:45.290145 4832 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/09dbda77-0b4d-45c5-8c13-a82c96c8b50b-build-system-configs\") on node \"crc\" DevicePath \"\"" Oct 03 00:24:45 crc kubenswrapper[4832]: I1003 00:24:45.290160 4832 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/09dbda77-0b4d-45c5-8c13-a82c96c8b50b-container-storage-root\") on node \"crc\" DevicePath \"\"" Oct 03 00:24:45 crc kubenswrapper[4832]: I1003 00:24:45.290172 4832 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/09dbda77-0b4d-45c5-8c13-a82c96c8b50b-container-storage-run\") on node \"crc\" DevicePath \"\"" Oct 03 00:24:45 crc kubenswrapper[4832]: I1003 00:24:45.290184 4832 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-bl95s-pull\" (UniqueName: \"kubernetes.io/secret/09dbda77-0b4d-45c5-8c13-a82c96c8b50b-builder-dockercfg-bl95s-pull\") on node \"crc\" DevicePath \"\"" Oct 03 00:24:45 crc kubenswrapper[4832]: I1003 00:24:45.290199 4832 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/09dbda77-0b4d-45c5-8c13-a82c96c8b50b-build-blob-cache\") on node \"crc\" DevicePath \"\"" Oct 03 00:24:45 crc kubenswrapper[4832]: I1003 00:24:45.290212 4832 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/09dbda77-0b4d-45c5-8c13-a82c96c8b50b-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 03 00:24:45 crc kubenswrapper[4832]: I1003 00:24:45.290225 4832 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/09dbda77-0b4d-45c5-8c13-a82c96c8b50b-buildworkdir\") on node \"crc\" DevicePath \"\"" Oct 03 00:24:45 crc kubenswrapper[4832]: I1003 00:24:45.290239 4832 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-bl95s-push\" (UniqueName: \"kubernetes.io/secret/09dbda77-0b4d-45c5-8c13-a82c96c8b50b-builder-dockercfg-bl95s-push\") on node \"crc\" DevicePath \"\"" Oct 03 00:24:45 crc kubenswrapper[4832]: I1003 00:24:45.290253 4832 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/09dbda77-0b4d-45c5-8c13-a82c96c8b50b-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 03 00:24:45 crc kubenswrapper[4832]: I1003 00:24:45.290266 4832 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/09dbda77-0b4d-45c5-8c13-a82c96c8b50b-buildcachedir\") on node \"crc\" DevicePath \"\"" Oct 03 00:24:45 crc kubenswrapper[4832]: I1003 00:24:45.922942 4832 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-1-build_09dbda77-0b4d-45c5-8c13-a82c96c8b50b/docker-build/0.log" Oct 03 00:24:45 crc kubenswrapper[4832]: I1003 00:24:45.923871 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-1-build" 
event={"ID":"09dbda77-0b4d-45c5-8c13-a82c96c8b50b","Type":"ContainerDied","Data":"b5a77ede6e4e3091f8b57a68fecfc3dd73544f4d33ce1ad9d0589f32f300f5af"} Oct 03 00:24:45 crc kubenswrapper[4832]: I1003 00:24:45.923932 4832 scope.go:117] "RemoveContainer" containerID="491043747d24f670f05f37a755c5575160c08d85179f1352914ee6b89f736640" Oct 03 00:24:45 crc kubenswrapper[4832]: I1003 00:24:45.924286 4832 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-1-build" Oct 03 00:24:45 crc kubenswrapper[4832]: I1003 00:24:45.945554 4832 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/service-telemetry-operator-1-build"] Oct 03 00:24:45 crc kubenswrapper[4832]: I1003 00:24:45.950583 4832 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/service-telemetry-operator-1-build"] Oct 03 00:24:45 crc kubenswrapper[4832]: I1003 00:24:45.952107 4832 scope.go:117] "RemoveContainer" containerID="e6ccfa05f5aeff6be4ed27162ffb7560d3561d996797c769ba96c732e122fab6" Oct 03 00:24:47 crc kubenswrapper[4832]: I1003 00:24:47.395645 4832 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09dbda77-0b4d-45c5-8c13-a82c96c8b50b" path="/var/lib/kubelet/pods/09dbda77-0b4d-45c5-8c13-a82c96c8b50b/volumes" Oct 03 00:24:47 crc kubenswrapper[4832]: I1003 00:24:47.922435 4832 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="service-telemetry/elasticsearch-es-default-0" Oct 03 00:24:51 crc kubenswrapper[4832]: I1003 00:24:51.987192 4832 generic.go:334] "Generic (PLEG): container finished" podID="f0837274-ab99-4217-840d-56315121e361" containerID="bb540cdd8dd224b1a73b56f72e4aa51ad03fb0504aa6957e899ffaf6ed556f70" exitCode=0 Oct 03 00:24:51 crc kubenswrapper[4832]: I1003 00:24:51.987271 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-2-build" event={"ID":"f0837274-ab99-4217-840d-56315121e361","Type":"ContainerDied","Data":"bb540cdd8dd224b1a73b56f72e4aa51ad03fb0504aa6957e899ffaf6ed556f70"} Oct 03 00:24:52 crc kubenswrapper[4832]: I1003 00:24:52.995475 4832 generic.go:334] "Generic (PLEG): container finished" podID="f0837274-ab99-4217-840d-56315121e361" containerID="54f6106a09c4b3091b64a4f4de4c967952afcc20925ce136de3fcc80a7bf217a" exitCode=0 Oct 03 00:24:52 crc kubenswrapper[4832]: I1003 00:24:52.995538 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-2-build" event={"ID":"f0837274-ab99-4217-840d-56315121e361","Type":"ContainerDied","Data":"54f6106a09c4b3091b64a4f4de4c967952afcc20925ce136de3fcc80a7bf217a"} Oct 03 00:24:53 crc kubenswrapper[4832]: I1003 00:24:53.029926 4832 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-2-build_f0837274-ab99-4217-840d-56315121e361/manage-dockerfile/0.log" Oct 03 00:24:54 crc kubenswrapper[4832]: I1003 00:24:54.003924 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-2-build" event={"ID":"f0837274-ab99-4217-840d-56315121e361","Type":"ContainerStarted","Data":"33a39fd17b58ee379ef6af3a4759d8df0a362f32b6bc42a46f1d006e01a30bb4"} Oct 03 00:24:54 crc kubenswrapper[4832]: I1003 00:24:54.040078 4832 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/service-telemetry-operator-2-build" podStartSLOduration=11.040060571 podStartE2EDuration="11.040060571s" podCreationTimestamp="2025-10-03 00:24:43 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 00:24:54.036706956 +0000 UTC m=+1087.298862072" watchObservedRunningTime="2025-10-03 00:24:54.040060571 +0000 UTC m=+1087.302215677" Oct 03 00:24:56 crc kubenswrapper[4832]: I1003 00:24:56.397027 4832 patch_prober.go:28] interesting pod/machine-config-daemon-xzpk4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 00:24:56 crc kubenswrapper[4832]: I1003 00:24:56.397092 4832 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" podUID="13eeb165-19f7-4e3f-a442-1c9c570d853b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 00:25:26 crc kubenswrapper[4832]: I1003 00:25:26.397461 4832 patch_prober.go:28] interesting pod/machine-config-daemon-xzpk4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 00:25:26 crc kubenswrapper[4832]: I1003 00:25:26.399792 4832 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" podUID="13eeb165-19f7-4e3f-a442-1c9c570d853b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 00:25:26 crc kubenswrapper[4832]: I1003 00:25:26.400064 4832 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" Oct 03 00:25:26 crc kubenswrapper[4832]: I1003 00:25:26.401047 4832 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8e0c5b5dfd83bacd8cdf28933197b737cfcc5a83c84debe8eeb40692c88e27e0"} pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 00:25:26 crc kubenswrapper[4832]: I1003 00:25:26.401336 4832 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" podUID="13eeb165-19f7-4e3f-a442-1c9c570d853b" containerName="machine-config-daemon" containerID="cri-o://8e0c5b5dfd83bacd8cdf28933197b737cfcc5a83c84debe8eeb40692c88e27e0" gracePeriod=600 Oct 03 00:25:27 crc kubenswrapper[4832]: I1003 00:25:27.184431 4832 generic.go:334] "Generic (PLEG): container finished" podID="13eeb165-19f7-4e3f-a442-1c9c570d853b" containerID="8e0c5b5dfd83bacd8cdf28933197b737cfcc5a83c84debe8eeb40692c88e27e0" exitCode=0 Oct 03 00:25:27 crc kubenswrapper[4832]: I1003 00:25:27.184470 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" event={"ID":"13eeb165-19f7-4e3f-a442-1c9c570d853b","Type":"ContainerDied","Data":"8e0c5b5dfd83bacd8cdf28933197b737cfcc5a83c84debe8eeb40692c88e27e0"} Oct 03 00:25:27 crc kubenswrapper[4832]: I1003 00:25:27.184777 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" event={"ID":"13eeb165-19f7-4e3f-a442-1c9c570d853b","Type":"ContainerStarted","Data":"369e795d8a4b85988f84323e9f3fc20c7b786482341179ac65c2354619c74c30"} Oct 03 00:25:27 crc kubenswrapper[4832]: I1003 00:25:27.184798 4832 scope.go:117] "RemoveContainer" containerID="fe714a41dbfd3df82e68d6952db5280812ff6c66e0e98d29ae26f7117a7c4a98" Oct 03 00:26:37 crc kubenswrapper[4832]: I1003 00:26:37.578695 4832 generic.go:334] "Generic (PLEG): container finished" podID="f0837274-ab99-4217-840d-56315121e361" containerID="33a39fd17b58ee379ef6af3a4759d8df0a362f32b6bc42a46f1d006e01a30bb4" exitCode=0 Oct 03 00:26:37 crc kubenswrapper[4832]: I1003 00:26:37.578778 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-2-build" event={"ID":"f0837274-ab99-4217-840d-56315121e361","Type":"ContainerDied","Data":"33a39fd17b58ee379ef6af3a4759d8df0a362f32b6bc42a46f1d006e01a30bb4"} Oct 03 00:26:38 crc kubenswrapper[4832]: I1003 00:26:38.815854 4832 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-2-build" Oct 03 00:26:38 crc kubenswrapper[4832]: I1003 00:26:38.849961 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/f0837274-ab99-4217-840d-56315121e361-container-storage-run\") pod \"f0837274-ab99-4217-840d-56315121e361\" (UID: \"f0837274-ab99-4217-840d-56315121e361\") " Oct 03 00:26:38 crc kubenswrapper[4832]: I1003 00:26:38.850469 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/f0837274-ab99-4217-840d-56315121e361-buildworkdir\") pod \"f0837274-ab99-4217-840d-56315121e361\" (UID: \"f0837274-ab99-4217-840d-56315121e361\") " Oct 03 00:26:38 crc kubenswrapper[4832]: I1003 00:26:38.850983 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f5fvc\" (UniqueName: \"kubernetes.io/projected/f0837274-ab99-4217-840d-56315121e361-kube-api-access-f5fvc\") pod \"f0837274-ab99-4217-840d-56315121e361\" (UID: \"f0837274-ab99-4217-840d-56315121e361\") " Oct 03 00:26:38 crc kubenswrapper[4832]: I1003 00:26:38.851109 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-bl95s-pull\" (UniqueName: \"kubernetes.io/secret/f0837274-ab99-4217-840d-56315121e361-builder-dockercfg-bl95s-pull\") pod \"f0837274-ab99-4217-840d-56315121e361\" (UID: \"f0837274-ab99-4217-840d-56315121e361\") " Oct 03 00:26:38 crc kubenswrapper[4832]: I1003 00:26:38.851355 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/f0837274-ab99-4217-840d-56315121e361-container-storage-root\") pod \"f0837274-ab99-4217-840d-56315121e361\" (UID: \"f0837274-ab99-4217-840d-56315121e361\") " Oct 03 00:26:38 crc kubenswrapper[4832]: I1003 00:26:38.854584 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/f0837274-ab99-4217-840d-56315121e361-build-blob-cache\") pod \"f0837274-ab99-4217-840d-56315121e361\" (UID: \"f0837274-ab99-4217-840d-56315121e361\") " Oct 03 00:26:38 crc kubenswrapper[4832]: I1003 00:26:38.851348 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/f0837274-ab99-4217-840d-56315121e361-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "f0837274-ab99-4217-840d-56315121e361" (UID: "f0837274-ab99-4217-840d-56315121e361"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 00:26:38 crc kubenswrapper[4832]: I1003 00:26:38.854650 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f0837274-ab99-4217-840d-56315121e361-build-proxy-ca-bundles\") pod \"f0837274-ab99-4217-840d-56315121e361\" (UID: \"f0837274-ab99-4217-840d-56315121e361\") " Oct 03 00:26:38 crc kubenswrapper[4832]: I1003 00:26:38.854680 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/f0837274-ab99-4217-840d-56315121e361-buildcachedir\") pod \"f0837274-ab99-4217-840d-56315121e361\" (UID: \"f0837274-ab99-4217-840d-56315121e361\") " Oct 03 00:26:38 crc kubenswrapper[4832]: I1003 00:26:38.854696 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f0837274-ab99-4217-840d-56315121e361-build-ca-bundles\") pod \"f0837274-ab99-4217-840d-56315121e361\" (UID: \"f0837274-ab99-4217-840d-56315121e361\") " Oct 03 00:26:38 crc kubenswrapper[4832]: I1003 00:26:38.854728 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/f0837274-ab99-4217-840d-56315121e361-build-system-configs\") pod \"f0837274-ab99-4217-840d-56315121e361\" (UID: \"f0837274-ab99-4217-840d-56315121e361\") " Oct 03 00:26:38 crc kubenswrapper[4832]: I1003 00:26:38.854748 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-bl95s-push\" (UniqueName: \"kubernetes.io/secret/f0837274-ab99-4217-840d-56315121e361-builder-dockercfg-bl95s-push\") pod \"f0837274-ab99-4217-840d-56315121e361\" (UID: \"f0837274-ab99-4217-840d-56315121e361\") " Oct 03 00:26:38 crc kubenswrapper[4832]: I1003 00:26:38.854749 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f0837274-ab99-4217-840d-56315121e361-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "f0837274-ab99-4217-840d-56315121e361" (UID: "f0837274-ab99-4217-840d-56315121e361"). InnerVolumeSpecName "buildcachedir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 00:26:38 crc kubenswrapper[4832]: I1003 00:26:38.854790 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/f0837274-ab99-4217-840d-56315121e361-node-pullsecrets\") pod \"f0837274-ab99-4217-840d-56315121e361\" (UID: \"f0837274-ab99-4217-840d-56315121e361\") " Oct 03 00:26:38 crc kubenswrapper[4832]: I1003 00:26:38.855119 4832 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/f0837274-ab99-4217-840d-56315121e361-container-storage-run\") on node \"crc\" DevicePath \"\"" Oct 03 00:26:38 crc kubenswrapper[4832]: I1003 00:26:38.855132 4832 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/f0837274-ab99-4217-840d-56315121e361-buildcachedir\") on node \"crc\" DevicePath \"\"" Oct 03 00:26:38 crc kubenswrapper[4832]: I1003 00:26:38.855161 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f0837274-ab99-4217-840d-56315121e361-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "f0837274-ab99-4217-840d-56315121e361" (UID: "f0837274-ab99-4217-840d-56315121e361"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 00:26:38 crc kubenswrapper[4832]: I1003 00:26:38.855206 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f0837274-ab99-4217-840d-56315121e361-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "f0837274-ab99-4217-840d-56315121e361" (UID: "f0837274-ab99-4217-840d-56315121e361"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 00:26:38 crc kubenswrapper[4832]: I1003 00:26:38.855317 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f0837274-ab99-4217-840d-56315121e361-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "f0837274-ab99-4217-840d-56315121e361" (UID: "f0837274-ab99-4217-840d-56315121e361"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 00:26:38 crc kubenswrapper[4832]: I1003 00:26:38.855874 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f0837274-ab99-4217-840d-56315121e361-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "f0837274-ab99-4217-840d-56315121e361" (UID: "f0837274-ab99-4217-840d-56315121e361"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 00:26:38 crc kubenswrapper[4832]: I1003 00:26:38.884017 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f0837274-ab99-4217-840d-56315121e361-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "f0837274-ab99-4217-840d-56315121e361" (UID: "f0837274-ab99-4217-840d-56315121e361"). InnerVolumeSpecName "buildworkdir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 00:26:38 crc kubenswrapper[4832]: I1003 00:26:38.892863 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0837274-ab99-4217-840d-56315121e361-kube-api-access-f5fvc" (OuterVolumeSpecName: "kube-api-access-f5fvc") pod "f0837274-ab99-4217-840d-56315121e361" (UID: "f0837274-ab99-4217-840d-56315121e361"). InnerVolumeSpecName "kube-api-access-f5fvc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 00:26:38 crc kubenswrapper[4832]: I1003 00:26:38.893250 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0837274-ab99-4217-840d-56315121e361-builder-dockercfg-bl95s-pull" (OuterVolumeSpecName: "builder-dockercfg-bl95s-pull") pod "f0837274-ab99-4217-840d-56315121e361" (UID: "f0837274-ab99-4217-840d-56315121e361"). InnerVolumeSpecName "builder-dockercfg-bl95s-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 00:26:38 crc kubenswrapper[4832]: I1003 00:26:38.895514 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0837274-ab99-4217-840d-56315121e361-builder-dockercfg-bl95s-push" (OuterVolumeSpecName: "builder-dockercfg-bl95s-push") pod "f0837274-ab99-4217-840d-56315121e361" (UID: "f0837274-ab99-4217-840d-56315121e361"). InnerVolumeSpecName "builder-dockercfg-bl95s-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 00:26:38 crc kubenswrapper[4832]: I1003 00:26:38.956500 4832 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/f0837274-ab99-4217-840d-56315121e361-buildworkdir\") on node \"crc\" DevicePath \"\"" Oct 03 00:26:38 crc kubenswrapper[4832]: I1003 00:26:38.956535 4832 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f5fvc\" (UniqueName: \"kubernetes.io/projected/f0837274-ab99-4217-840d-56315121e361-kube-api-access-f5fvc\") on node \"crc\" DevicePath \"\"" Oct 03 00:26:38 crc kubenswrapper[4832]: I1003 00:26:38.956546 4832 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-bl95s-pull\" (UniqueName: \"kubernetes.io/secret/f0837274-ab99-4217-840d-56315121e361-builder-dockercfg-bl95s-pull\") on node \"crc\" DevicePath \"\"" Oct 03 00:26:38 crc kubenswrapper[4832]: I1003 00:26:38.956555 4832 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f0837274-ab99-4217-840d-56315121e361-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 03 00:26:38 crc kubenswrapper[4832]: I1003 00:26:38.956566 4832 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f0837274-ab99-4217-840d-56315121e361-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 03 00:26:38 crc kubenswrapper[4832]: I1003 00:26:38.956575 4832 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/f0837274-ab99-4217-840d-56315121e361-build-system-configs\") on node \"crc\" DevicePath \"\"" Oct 03 00:26:38 crc kubenswrapper[4832]: I1003 00:26:38.956584 4832 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-bl95s-push\" (UniqueName: \"kubernetes.io/secret/f0837274-ab99-4217-840d-56315121e361-builder-dockercfg-bl95s-push\") on node \"crc\" DevicePath \"\"" Oct 03 00:26:38 crc kubenswrapper[4832]: I1003 00:26:38.956592 4832 reconciler_common.go:293] "Volume detached 
for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/f0837274-ab99-4217-840d-56315121e361-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Oct 03 00:26:39 crc kubenswrapper[4832]: I1003 00:26:39.061966 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f0837274-ab99-4217-840d-56315121e361-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "f0837274-ab99-4217-840d-56315121e361" (UID: "f0837274-ab99-4217-840d-56315121e361"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 00:26:39 crc kubenswrapper[4832]: I1003 00:26:39.158192 4832 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/f0837274-ab99-4217-840d-56315121e361-build-blob-cache\") on node \"crc\" DevicePath \"\"" Oct 03 00:26:39 crc kubenswrapper[4832]: I1003 00:26:39.593278 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-2-build" event={"ID":"f0837274-ab99-4217-840d-56315121e361","Type":"ContainerDied","Data":"f9938a57c5c656c5ea5157e4aa2dbefb3da840abd39cb13c0df92d53410ed408"} Oct 03 00:26:39 crc kubenswrapper[4832]: I1003 00:26:39.593324 4832 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f9938a57c5c656c5ea5157e4aa2dbefb3da840abd39cb13c0df92d53410ed408" Oct 03 00:26:39 crc kubenswrapper[4832]: I1003 00:26:39.593678 4832 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-2-build" Oct 03 00:26:40 crc kubenswrapper[4832]: I1003 00:26:40.769594 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f0837274-ab99-4217-840d-56315121e361-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "f0837274-ab99-4217-840d-56315121e361" (UID: "f0837274-ab99-4217-840d-56315121e361"). InnerVolumeSpecName "container-storage-root". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 00:26:40 crc kubenswrapper[4832]: I1003 00:26:40.773631 4832 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/f0837274-ab99-4217-840d-56315121e361-container-storage-root\") on node \"crc\" DevicePath \"\"" Oct 03 00:26:43 crc kubenswrapper[4832]: I1003 00:26:43.567650 4832 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/smart-gateway-operator-1-build"] Oct 03 00:26:43 crc kubenswrapper[4832]: E1003 00:26:43.568147 4832 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0837274-ab99-4217-840d-56315121e361" containerName="manage-dockerfile" Oct 03 00:26:43 crc kubenswrapper[4832]: I1003 00:26:43.568159 4832 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0837274-ab99-4217-840d-56315121e361" containerName="manage-dockerfile" Oct 03 00:26:43 crc kubenswrapper[4832]: E1003 00:26:43.568170 4832 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09dbda77-0b4d-45c5-8c13-a82c96c8b50b" containerName="manage-dockerfile" Oct 03 00:26:43 crc kubenswrapper[4832]: I1003 00:26:43.568176 4832 state_mem.go:107] "Deleted CPUSet assignment" podUID="09dbda77-0b4d-45c5-8c13-a82c96c8b50b" containerName="manage-dockerfile" Oct 03 00:26:43 crc kubenswrapper[4832]: E1003 00:26:43.568186 4832 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09dbda77-0b4d-45c5-8c13-a82c96c8b50b" containerName="docker-build" Oct 03 00:26:43 crc kubenswrapper[4832]: I1003 00:26:43.568192 4832 state_mem.go:107] "Deleted CPUSet assignment" podUID="09dbda77-0b4d-45c5-8c13-a82c96c8b50b" containerName="docker-build" Oct 03 00:26:43 crc kubenswrapper[4832]: E1003 00:26:43.568199 4832 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0837274-ab99-4217-840d-56315121e361" containerName="git-clone" Oct 03 00:26:43 crc kubenswrapper[4832]: I1003 00:26:43.568204 4832 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0837274-ab99-4217-840d-56315121e361" containerName="git-clone" Oct 03 00:26:43 crc kubenswrapper[4832]: E1003 00:26:43.568221 4832 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0837274-ab99-4217-840d-56315121e361" containerName="docker-build" Oct 03 00:26:43 crc kubenswrapper[4832]: I1003 00:26:43.568226 4832 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0837274-ab99-4217-840d-56315121e361" containerName="docker-build" Oct 03 00:26:43 crc kubenswrapper[4832]: I1003 00:26:43.568325 4832 memory_manager.go:354] "RemoveStaleState removing state" podUID="09dbda77-0b4d-45c5-8c13-a82c96c8b50b" containerName="docker-build" Oct 03 00:26:43 crc kubenswrapper[4832]: I1003 00:26:43.568339 4832 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0837274-ab99-4217-840d-56315121e361" containerName="docker-build" Oct 03 00:26:43 crc kubenswrapper[4832]: I1003 00:26:43.568922 4832 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/smart-gateway-operator-1-build" Oct 03 00:26:43 crc kubenswrapper[4832]: I1003 00:26:43.570807 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-1-global-ca" Oct 03 00:26:43 crc kubenswrapper[4832]: I1003 00:26:43.570908 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-1-sys-config" Oct 03 00:26:43 crc kubenswrapper[4832]: I1003 00:26:43.571717 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-1-ca" Oct 03 00:26:43 crc kubenswrapper[4832]: I1003 00:26:43.572652 4832 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-bl95s" Oct 03 00:26:43 crc kubenswrapper[4832]: I1003 00:26:43.586513 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-1-build"] Oct 03 00:26:43 crc kubenswrapper[4832]: I1003 00:26:43.739139 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qcjnr\" (UniqueName: \"kubernetes.io/projected/c299209f-5aad-4541-b24a-d08e64b0f177-kube-api-access-qcjnr\") pod \"smart-gateway-operator-1-build\" (UID: \"c299209f-5aad-4541-b24a-d08e64b0f177\") " pod="service-telemetry/smart-gateway-operator-1-build" Oct 03 00:26:43 crc kubenswrapper[4832]: I1003 00:26:43.739203 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c299209f-5aad-4541-b24a-d08e64b0f177-build-proxy-ca-bundles\") pod \"smart-gateway-operator-1-build\" (UID: \"c299209f-5aad-4541-b24a-d08e64b0f177\") " pod="service-telemetry/smart-gateway-operator-1-build" Oct 03 00:26:43 crc kubenswrapper[4832]: I1003 00:26:43.739247 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/c299209f-5aad-4541-b24a-d08e64b0f177-container-storage-run\") pod \"smart-gateway-operator-1-build\" (UID: \"c299209f-5aad-4541-b24a-d08e64b0f177\") " pod="service-telemetry/smart-gateway-operator-1-build" Oct 03 00:26:43 crc kubenswrapper[4832]: I1003 00:26:43.739291 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/c299209f-5aad-4541-b24a-d08e64b0f177-build-blob-cache\") pod \"smart-gateway-operator-1-build\" (UID: \"c299209f-5aad-4541-b24a-d08e64b0f177\") " pod="service-telemetry/smart-gateway-operator-1-build" Oct 03 00:26:43 crc kubenswrapper[4832]: I1003 00:26:43.739327 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-bl95s-pull\" (UniqueName: \"kubernetes.io/secret/c299209f-5aad-4541-b24a-d08e64b0f177-builder-dockercfg-bl95s-pull\") pod \"smart-gateway-operator-1-build\" (UID: \"c299209f-5aad-4541-b24a-d08e64b0f177\") " pod="service-telemetry/smart-gateway-operator-1-build" Oct 03 00:26:43 crc kubenswrapper[4832]: I1003 00:26:43.739364 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-bl95s-push\" (UniqueName: \"kubernetes.io/secret/c299209f-5aad-4541-b24a-d08e64b0f177-builder-dockercfg-bl95s-push\") pod \"smart-gateway-operator-1-build\" (UID: \"c299209f-5aad-4541-b24a-d08e64b0f177\") 
" pod="service-telemetry/smart-gateway-operator-1-build" Oct 03 00:26:43 crc kubenswrapper[4832]: I1003 00:26:43.739393 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/c299209f-5aad-4541-b24a-d08e64b0f177-container-storage-root\") pod \"smart-gateway-operator-1-build\" (UID: \"c299209f-5aad-4541-b24a-d08e64b0f177\") " pod="service-telemetry/smart-gateway-operator-1-build" Oct 03 00:26:43 crc kubenswrapper[4832]: I1003 00:26:43.739478 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/c299209f-5aad-4541-b24a-d08e64b0f177-node-pullsecrets\") pod \"smart-gateway-operator-1-build\" (UID: \"c299209f-5aad-4541-b24a-d08e64b0f177\") " pod="service-telemetry/smart-gateway-operator-1-build" Oct 03 00:26:43 crc kubenswrapper[4832]: I1003 00:26:43.739514 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c299209f-5aad-4541-b24a-d08e64b0f177-build-ca-bundles\") pod \"smart-gateway-operator-1-build\" (UID: \"c299209f-5aad-4541-b24a-d08e64b0f177\") " pod="service-telemetry/smart-gateway-operator-1-build" Oct 03 00:26:43 crc kubenswrapper[4832]: I1003 00:26:43.739568 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/c299209f-5aad-4541-b24a-d08e64b0f177-buildworkdir\") pod \"smart-gateway-operator-1-build\" (UID: \"c299209f-5aad-4541-b24a-d08e64b0f177\") " pod="service-telemetry/smart-gateway-operator-1-build" Oct 03 00:26:43 crc kubenswrapper[4832]: I1003 00:26:43.739632 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/c299209f-5aad-4541-b24a-d08e64b0f177-build-system-configs\") pod \"smart-gateway-operator-1-build\" (UID: \"c299209f-5aad-4541-b24a-d08e64b0f177\") " pod="service-telemetry/smart-gateway-operator-1-build" Oct 03 00:26:43 crc kubenswrapper[4832]: I1003 00:26:43.739664 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/c299209f-5aad-4541-b24a-d08e64b0f177-buildcachedir\") pod \"smart-gateway-operator-1-build\" (UID: \"c299209f-5aad-4541-b24a-d08e64b0f177\") " pod="service-telemetry/smart-gateway-operator-1-build" Oct 03 00:26:43 crc kubenswrapper[4832]: I1003 00:26:43.841020 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/c299209f-5aad-4541-b24a-d08e64b0f177-container-storage-run\") pod \"smart-gateway-operator-1-build\" (UID: \"c299209f-5aad-4541-b24a-d08e64b0f177\") " pod="service-telemetry/smart-gateway-operator-1-build" Oct 03 00:26:43 crc kubenswrapper[4832]: I1003 00:26:43.841126 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/c299209f-5aad-4541-b24a-d08e64b0f177-build-blob-cache\") pod \"smart-gateway-operator-1-build\" (UID: \"c299209f-5aad-4541-b24a-d08e64b0f177\") " pod="service-telemetry/smart-gateway-operator-1-build" Oct 03 00:26:43 crc kubenswrapper[4832]: I1003 00:26:43.841181 4832 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"builder-dockercfg-bl95s-pull\" (UniqueName: \"kubernetes.io/secret/c299209f-5aad-4541-b24a-d08e64b0f177-builder-dockercfg-bl95s-pull\") pod \"smart-gateway-operator-1-build\" (UID: \"c299209f-5aad-4541-b24a-d08e64b0f177\") " pod="service-telemetry/smart-gateway-operator-1-build" Oct 03 00:26:43 crc kubenswrapper[4832]: I1003 00:26:43.841229 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-bl95s-push\" (UniqueName: \"kubernetes.io/secret/c299209f-5aad-4541-b24a-d08e64b0f177-builder-dockercfg-bl95s-push\") pod \"smart-gateway-operator-1-build\" (UID: \"c299209f-5aad-4541-b24a-d08e64b0f177\") " pod="service-telemetry/smart-gateway-operator-1-build" Oct 03 00:26:43 crc kubenswrapper[4832]: I1003 00:26:43.841266 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/c299209f-5aad-4541-b24a-d08e64b0f177-container-storage-root\") pod \"smart-gateway-operator-1-build\" (UID: \"c299209f-5aad-4541-b24a-d08e64b0f177\") " pod="service-telemetry/smart-gateway-operator-1-build" Oct 03 00:26:43 crc kubenswrapper[4832]: I1003 00:26:43.841320 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/c299209f-5aad-4541-b24a-d08e64b0f177-node-pullsecrets\") pod \"smart-gateway-operator-1-build\" (UID: \"c299209f-5aad-4541-b24a-d08e64b0f177\") " pod="service-telemetry/smart-gateway-operator-1-build" Oct 03 00:26:43 crc kubenswrapper[4832]: I1003 00:26:43.841358 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c299209f-5aad-4541-b24a-d08e64b0f177-build-ca-bundles\") pod \"smart-gateway-operator-1-build\" (UID: \"c299209f-5aad-4541-b24a-d08e64b0f177\") " pod="service-telemetry/smart-gateway-operator-1-build" Oct 03 00:26:43 crc kubenswrapper[4832]: I1003 00:26:43.841391 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/c299209f-5aad-4541-b24a-d08e64b0f177-buildworkdir\") pod \"smart-gateway-operator-1-build\" (UID: \"c299209f-5aad-4541-b24a-d08e64b0f177\") " pod="service-telemetry/smart-gateway-operator-1-build" Oct 03 00:26:43 crc kubenswrapper[4832]: I1003 00:26:43.841473 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/c299209f-5aad-4541-b24a-d08e64b0f177-build-system-configs\") pod \"smart-gateway-operator-1-build\" (UID: \"c299209f-5aad-4541-b24a-d08e64b0f177\") " pod="service-telemetry/smart-gateway-operator-1-build" Oct 03 00:26:43 crc kubenswrapper[4832]: I1003 00:26:43.841506 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/c299209f-5aad-4541-b24a-d08e64b0f177-buildcachedir\") pod \"smart-gateway-operator-1-build\" (UID: \"c299209f-5aad-4541-b24a-d08e64b0f177\") " pod="service-telemetry/smart-gateway-operator-1-build" Oct 03 00:26:43 crc kubenswrapper[4832]: I1003 00:26:43.841577 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qcjnr\" (UniqueName: \"kubernetes.io/projected/c299209f-5aad-4541-b24a-d08e64b0f177-kube-api-access-qcjnr\") pod \"smart-gateway-operator-1-build\" (UID: \"c299209f-5aad-4541-b24a-d08e64b0f177\") " 
pod="service-telemetry/smart-gateway-operator-1-build" Oct 03 00:26:43 crc kubenswrapper[4832]: I1003 00:26:43.841617 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c299209f-5aad-4541-b24a-d08e64b0f177-build-proxy-ca-bundles\") pod \"smart-gateway-operator-1-build\" (UID: \"c299209f-5aad-4541-b24a-d08e64b0f177\") " pod="service-telemetry/smart-gateway-operator-1-build" Oct 03 00:26:43 crc kubenswrapper[4832]: I1003 00:26:43.841621 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/c299209f-5aad-4541-b24a-d08e64b0f177-container-storage-run\") pod \"smart-gateway-operator-1-build\" (UID: \"c299209f-5aad-4541-b24a-d08e64b0f177\") " pod="service-telemetry/smart-gateway-operator-1-build" Oct 03 00:26:43 crc kubenswrapper[4832]: I1003 00:26:43.841657 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/c299209f-5aad-4541-b24a-d08e64b0f177-build-blob-cache\") pod \"smart-gateway-operator-1-build\" (UID: \"c299209f-5aad-4541-b24a-d08e64b0f177\") " pod="service-telemetry/smart-gateway-operator-1-build" Oct 03 00:26:43 crc kubenswrapper[4832]: I1003 00:26:43.841741 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/c299209f-5aad-4541-b24a-d08e64b0f177-node-pullsecrets\") pod \"smart-gateway-operator-1-build\" (UID: \"c299209f-5aad-4541-b24a-d08e64b0f177\") " pod="service-telemetry/smart-gateway-operator-1-build" Oct 03 00:26:43 crc kubenswrapper[4832]: I1003 00:26:43.842044 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/c299209f-5aad-4541-b24a-d08e64b0f177-buildcachedir\") pod \"smart-gateway-operator-1-build\" (UID: \"c299209f-5aad-4541-b24a-d08e64b0f177\") " pod="service-telemetry/smart-gateway-operator-1-build" Oct 03 00:26:43 crc kubenswrapper[4832]: I1003 00:26:43.842375 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/c299209f-5aad-4541-b24a-d08e64b0f177-container-storage-root\") pod \"smart-gateway-operator-1-build\" (UID: \"c299209f-5aad-4541-b24a-d08e64b0f177\") " pod="service-telemetry/smart-gateway-operator-1-build" Oct 03 00:26:43 crc kubenswrapper[4832]: I1003 00:26:43.842581 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c299209f-5aad-4541-b24a-d08e64b0f177-build-ca-bundles\") pod \"smart-gateway-operator-1-build\" (UID: \"c299209f-5aad-4541-b24a-d08e64b0f177\") " pod="service-telemetry/smart-gateway-operator-1-build" Oct 03 00:26:43 crc kubenswrapper[4832]: I1003 00:26:43.842589 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/c299209f-5aad-4541-b24a-d08e64b0f177-buildworkdir\") pod \"smart-gateway-operator-1-build\" (UID: \"c299209f-5aad-4541-b24a-d08e64b0f177\") " pod="service-telemetry/smart-gateway-operator-1-build" Oct 03 00:26:43 crc kubenswrapper[4832]: I1003 00:26:43.842694 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c299209f-5aad-4541-b24a-d08e64b0f177-build-proxy-ca-bundles\") pod 
\"smart-gateway-operator-1-build\" (UID: \"c299209f-5aad-4541-b24a-d08e64b0f177\") " pod="service-telemetry/smart-gateway-operator-1-build" Oct 03 00:26:43 crc kubenswrapper[4832]: I1003 00:26:43.842869 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/c299209f-5aad-4541-b24a-d08e64b0f177-build-system-configs\") pod \"smart-gateway-operator-1-build\" (UID: \"c299209f-5aad-4541-b24a-d08e64b0f177\") " pod="service-telemetry/smart-gateway-operator-1-build" Oct 03 00:26:43 crc kubenswrapper[4832]: I1003 00:26:43.846130 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-bl95s-push\" (UniqueName: \"kubernetes.io/secret/c299209f-5aad-4541-b24a-d08e64b0f177-builder-dockercfg-bl95s-push\") pod \"smart-gateway-operator-1-build\" (UID: \"c299209f-5aad-4541-b24a-d08e64b0f177\") " pod="service-telemetry/smart-gateway-operator-1-build" Oct 03 00:26:43 crc kubenswrapper[4832]: I1003 00:26:43.846662 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-bl95s-pull\" (UniqueName: \"kubernetes.io/secret/c299209f-5aad-4541-b24a-d08e64b0f177-builder-dockercfg-bl95s-pull\") pod \"smart-gateway-operator-1-build\" (UID: \"c299209f-5aad-4541-b24a-d08e64b0f177\") " pod="service-telemetry/smart-gateway-operator-1-build" Oct 03 00:26:43 crc kubenswrapper[4832]: I1003 00:26:43.860677 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qcjnr\" (UniqueName: \"kubernetes.io/projected/c299209f-5aad-4541-b24a-d08e64b0f177-kube-api-access-qcjnr\") pod \"smart-gateway-operator-1-build\" (UID: \"c299209f-5aad-4541-b24a-d08e64b0f177\") " pod="service-telemetry/smart-gateway-operator-1-build" Oct 03 00:26:43 crc kubenswrapper[4832]: I1003 00:26:43.886168 4832 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/smart-gateway-operator-1-build" Oct 03 00:26:44 crc kubenswrapper[4832]: I1003 00:26:44.295452 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-1-build"] Oct 03 00:26:44 crc kubenswrapper[4832]: I1003 00:26:44.653615 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-1-build" event={"ID":"c299209f-5aad-4541-b24a-d08e64b0f177","Type":"ContainerStarted","Data":"c81b337bef55594cfe88c267b978b0f622e6a22cfd2b4ac5a2d52fdd552d4139"} Oct 03 00:26:44 crc kubenswrapper[4832]: I1003 00:26:44.654134 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-1-build" event={"ID":"c299209f-5aad-4541-b24a-d08e64b0f177","Type":"ContainerStarted","Data":"c704464addf774084083a3159c2811bcae3015c4b239e800fc8c9029cb4c5e1d"} Oct 03 00:26:45 crc kubenswrapper[4832]: I1003 00:26:45.661071 4832 generic.go:334] "Generic (PLEG): container finished" podID="c299209f-5aad-4541-b24a-d08e64b0f177" containerID="c81b337bef55594cfe88c267b978b0f622e6a22cfd2b4ac5a2d52fdd552d4139" exitCode=0 Oct 03 00:26:45 crc kubenswrapper[4832]: I1003 00:26:45.661135 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-1-build" event={"ID":"c299209f-5aad-4541-b24a-d08e64b0f177","Type":"ContainerDied","Data":"c81b337bef55594cfe88c267b978b0f622e6a22cfd2b4ac5a2d52fdd552d4139"} Oct 03 00:26:45 crc kubenswrapper[4832]: I1003 00:26:45.661411 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-1-build" event={"ID":"c299209f-5aad-4541-b24a-d08e64b0f177","Type":"ContainerStarted","Data":"725e727c70b04fc9b4ad800181a1968914305eb8d5e34bf9cd2797223c5fdeb5"} Oct 03 00:26:45 crc kubenswrapper[4832]: I1003 00:26:45.684145 4832 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/smart-gateway-operator-1-build" podStartSLOduration=2.684123927 podStartE2EDuration="2.684123927s" podCreationTimestamp="2025-10-03 00:26:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 00:26:45.679221614 +0000 UTC m=+1198.941376730" watchObservedRunningTime="2025-10-03 00:26:45.684123927 +0000 UTC m=+1198.946279033" Oct 03 00:26:54 crc kubenswrapper[4832]: I1003 00:26:54.451154 4832 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/smart-gateway-operator-1-build"] Oct 03 00:26:54 crc kubenswrapper[4832]: I1003 00:26:54.452004 4832 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/smart-gateway-operator-1-build" podUID="c299209f-5aad-4541-b24a-d08e64b0f177" containerName="docker-build" containerID="cri-o://725e727c70b04fc9b4ad800181a1968914305eb8d5e34bf9cd2797223c5fdeb5" gracePeriod=30 Oct 03 00:26:54 crc kubenswrapper[4832]: I1003 00:26:54.711402 4832 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-1-build_c299209f-5aad-4541-b24a-d08e64b0f177/docker-build/0.log" Oct 03 00:26:54 crc kubenswrapper[4832]: I1003 00:26:54.712039 4832 generic.go:334] "Generic (PLEG): container finished" podID="c299209f-5aad-4541-b24a-d08e64b0f177" containerID="725e727c70b04fc9b4ad800181a1968914305eb8d5e34bf9cd2797223c5fdeb5" exitCode=1 Oct 03 00:26:54 crc kubenswrapper[4832]: I1003 00:26:54.712095 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="service-telemetry/smart-gateway-operator-1-build" event={"ID":"c299209f-5aad-4541-b24a-d08e64b0f177","Type":"ContainerDied","Data":"725e727c70b04fc9b4ad800181a1968914305eb8d5e34bf9cd2797223c5fdeb5"} Oct 03 00:26:55 crc kubenswrapper[4832]: I1003 00:26:55.305195 4832 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-1-build_c299209f-5aad-4541-b24a-d08e64b0f177/docker-build/0.log" Oct 03 00:26:55 crc kubenswrapper[4832]: I1003 00:26:55.305892 4832 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-1-build" Oct 03 00:26:55 crc kubenswrapper[4832]: I1003 00:26:55.488571 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/c299209f-5aad-4541-b24a-d08e64b0f177-buildcachedir\") pod \"c299209f-5aad-4541-b24a-d08e64b0f177\" (UID: \"c299209f-5aad-4541-b24a-d08e64b0f177\") " Oct 03 00:26:55 crc kubenswrapper[4832]: I1003 00:26:55.488689 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-bl95s-push\" (UniqueName: \"kubernetes.io/secret/c299209f-5aad-4541-b24a-d08e64b0f177-builder-dockercfg-bl95s-push\") pod \"c299209f-5aad-4541-b24a-d08e64b0f177\" (UID: \"c299209f-5aad-4541-b24a-d08e64b0f177\") " Oct 03 00:26:55 crc kubenswrapper[4832]: I1003 00:26:55.488722 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c299209f-5aad-4541-b24a-d08e64b0f177-build-proxy-ca-bundles\") pod \"c299209f-5aad-4541-b24a-d08e64b0f177\" (UID: \"c299209f-5aad-4541-b24a-d08e64b0f177\") " Oct 03 00:26:55 crc kubenswrapper[4832]: I1003 00:26:55.488771 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c299209f-5aad-4541-b24a-d08e64b0f177-build-ca-bundles\") pod \"c299209f-5aad-4541-b24a-d08e64b0f177\" (UID: \"c299209f-5aad-4541-b24a-d08e64b0f177\") " Oct 03 00:26:55 crc kubenswrapper[4832]: I1003 00:26:55.488810 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/c299209f-5aad-4541-b24a-d08e64b0f177-container-storage-root\") pod \"c299209f-5aad-4541-b24a-d08e64b0f177\" (UID: \"c299209f-5aad-4541-b24a-d08e64b0f177\") " Oct 03 00:26:55 crc kubenswrapper[4832]: I1003 00:26:55.488847 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/c299209f-5aad-4541-b24a-d08e64b0f177-container-storage-run\") pod \"c299209f-5aad-4541-b24a-d08e64b0f177\" (UID: \"c299209f-5aad-4541-b24a-d08e64b0f177\") " Oct 03 00:26:55 crc kubenswrapper[4832]: I1003 00:26:55.488834 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c299209f-5aad-4541-b24a-d08e64b0f177-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "c299209f-5aad-4541-b24a-d08e64b0f177" (UID: "c299209f-5aad-4541-b24a-d08e64b0f177"). InnerVolumeSpecName "buildcachedir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 00:26:55 crc kubenswrapper[4832]: I1003 00:26:55.488875 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/c299209f-5aad-4541-b24a-d08e64b0f177-buildworkdir\") pod \"c299209f-5aad-4541-b24a-d08e64b0f177\" (UID: \"c299209f-5aad-4541-b24a-d08e64b0f177\") " Oct 03 00:26:55 crc kubenswrapper[4832]: I1003 00:26:55.488906 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/c299209f-5aad-4541-b24a-d08e64b0f177-build-blob-cache\") pod \"c299209f-5aad-4541-b24a-d08e64b0f177\" (UID: \"c299209f-5aad-4541-b24a-d08e64b0f177\") " Oct 03 00:26:55 crc kubenswrapper[4832]: I1003 00:26:55.488935 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-bl95s-pull\" (UniqueName: \"kubernetes.io/secret/c299209f-5aad-4541-b24a-d08e64b0f177-builder-dockercfg-bl95s-pull\") pod \"c299209f-5aad-4541-b24a-d08e64b0f177\" (UID: \"c299209f-5aad-4541-b24a-d08e64b0f177\") " Oct 03 00:26:55 crc kubenswrapper[4832]: I1003 00:26:55.488961 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/c299209f-5aad-4541-b24a-d08e64b0f177-node-pullsecrets\") pod \"c299209f-5aad-4541-b24a-d08e64b0f177\" (UID: \"c299209f-5aad-4541-b24a-d08e64b0f177\") " Oct 03 00:26:55 crc kubenswrapper[4832]: I1003 00:26:55.488993 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qcjnr\" (UniqueName: \"kubernetes.io/projected/c299209f-5aad-4541-b24a-d08e64b0f177-kube-api-access-qcjnr\") pod \"c299209f-5aad-4541-b24a-d08e64b0f177\" (UID: \"c299209f-5aad-4541-b24a-d08e64b0f177\") " Oct 03 00:26:55 crc kubenswrapper[4832]: I1003 00:26:55.489018 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/c299209f-5aad-4541-b24a-d08e64b0f177-build-system-configs\") pod \"c299209f-5aad-4541-b24a-d08e64b0f177\" (UID: \"c299209f-5aad-4541-b24a-d08e64b0f177\") " Oct 03 00:26:55 crc kubenswrapper[4832]: I1003 00:26:55.489060 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c299209f-5aad-4541-b24a-d08e64b0f177-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "c299209f-5aad-4541-b24a-d08e64b0f177" (UID: "c299209f-5aad-4541-b24a-d08e64b0f177"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 00:26:55 crc kubenswrapper[4832]: I1003 00:26:55.489710 4832 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/c299209f-5aad-4541-b24a-d08e64b0f177-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Oct 03 00:26:55 crc kubenswrapper[4832]: I1003 00:26:55.489702 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c299209f-5aad-4541-b24a-d08e64b0f177-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "c299209f-5aad-4541-b24a-d08e64b0f177" (UID: "c299209f-5aad-4541-b24a-d08e64b0f177"). InnerVolumeSpecName "buildworkdir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 00:26:55 crc kubenswrapper[4832]: I1003 00:26:55.489730 4832 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/c299209f-5aad-4541-b24a-d08e64b0f177-buildcachedir\") on node \"crc\" DevicePath \"\"" Oct 03 00:26:55 crc kubenswrapper[4832]: I1003 00:26:55.489850 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c299209f-5aad-4541-b24a-d08e64b0f177-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "c299209f-5aad-4541-b24a-d08e64b0f177" (UID: "c299209f-5aad-4541-b24a-d08e64b0f177"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 00:26:55 crc kubenswrapper[4832]: I1003 00:26:55.489992 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c299209f-5aad-4541-b24a-d08e64b0f177-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "c299209f-5aad-4541-b24a-d08e64b0f177" (UID: "c299209f-5aad-4541-b24a-d08e64b0f177"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 00:26:55 crc kubenswrapper[4832]: I1003 00:26:55.490007 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c299209f-5aad-4541-b24a-d08e64b0f177-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "c299209f-5aad-4541-b24a-d08e64b0f177" (UID: "c299209f-5aad-4541-b24a-d08e64b0f177"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 00:26:55 crc kubenswrapper[4832]: I1003 00:26:55.491131 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c299209f-5aad-4541-b24a-d08e64b0f177-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "c299209f-5aad-4541-b24a-d08e64b0f177" (UID: "c299209f-5aad-4541-b24a-d08e64b0f177"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 00:26:55 crc kubenswrapper[4832]: I1003 00:26:55.494672 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c299209f-5aad-4541-b24a-d08e64b0f177-builder-dockercfg-bl95s-push" (OuterVolumeSpecName: "builder-dockercfg-bl95s-push") pod "c299209f-5aad-4541-b24a-d08e64b0f177" (UID: "c299209f-5aad-4541-b24a-d08e64b0f177"). InnerVolumeSpecName "builder-dockercfg-bl95s-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 00:26:55 crc kubenswrapper[4832]: I1003 00:26:55.494732 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c299209f-5aad-4541-b24a-d08e64b0f177-kube-api-access-qcjnr" (OuterVolumeSpecName: "kube-api-access-qcjnr") pod "c299209f-5aad-4541-b24a-d08e64b0f177" (UID: "c299209f-5aad-4541-b24a-d08e64b0f177"). InnerVolumeSpecName "kube-api-access-qcjnr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 00:26:55 crc kubenswrapper[4832]: I1003 00:26:55.502985 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c299209f-5aad-4541-b24a-d08e64b0f177-builder-dockercfg-bl95s-pull" (OuterVolumeSpecName: "builder-dockercfg-bl95s-pull") pod "c299209f-5aad-4541-b24a-d08e64b0f177" (UID: "c299209f-5aad-4541-b24a-d08e64b0f177"). InnerVolumeSpecName "builder-dockercfg-bl95s-pull". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 00:26:55 crc kubenswrapper[4832]: I1003 00:26:55.590628 4832 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qcjnr\" (UniqueName: \"kubernetes.io/projected/c299209f-5aad-4541-b24a-d08e64b0f177-kube-api-access-qcjnr\") on node \"crc\" DevicePath \"\"" Oct 03 00:26:55 crc kubenswrapper[4832]: I1003 00:26:55.590662 4832 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/c299209f-5aad-4541-b24a-d08e64b0f177-build-system-configs\") on node \"crc\" DevicePath \"\"" Oct 03 00:26:55 crc kubenswrapper[4832]: I1003 00:26:55.590672 4832 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-bl95s-push\" (UniqueName: \"kubernetes.io/secret/c299209f-5aad-4541-b24a-d08e64b0f177-builder-dockercfg-bl95s-push\") on node \"crc\" DevicePath \"\"" Oct 03 00:26:55 crc kubenswrapper[4832]: I1003 00:26:55.590680 4832 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c299209f-5aad-4541-b24a-d08e64b0f177-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 03 00:26:55 crc kubenswrapper[4832]: I1003 00:26:55.590689 4832 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c299209f-5aad-4541-b24a-d08e64b0f177-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 03 00:26:55 crc kubenswrapper[4832]: I1003 00:26:55.590699 4832 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/c299209f-5aad-4541-b24a-d08e64b0f177-container-storage-run\") on node \"crc\" DevicePath \"\"" Oct 03 00:26:55 crc kubenswrapper[4832]: I1003 00:26:55.590707 4832 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/c299209f-5aad-4541-b24a-d08e64b0f177-buildworkdir\") on node \"crc\" DevicePath \"\"" Oct 03 00:26:55 crc kubenswrapper[4832]: I1003 00:26:55.590716 4832 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-bl95s-pull\" (UniqueName: \"kubernetes.io/secret/c299209f-5aad-4541-b24a-d08e64b0f177-builder-dockercfg-bl95s-pull\") on node \"crc\" DevicePath \"\"" Oct 03 00:26:55 crc kubenswrapper[4832]: I1003 00:26:55.656464 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c299209f-5aad-4541-b24a-d08e64b0f177-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "c299209f-5aad-4541-b24a-d08e64b0f177" (UID: "c299209f-5aad-4541-b24a-d08e64b0f177"). InnerVolumeSpecName "build-blob-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 00:26:55 crc kubenswrapper[4832]: I1003 00:26:55.691607 4832 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/c299209f-5aad-4541-b24a-d08e64b0f177-build-blob-cache\") on node \"crc\" DevicePath \"\"" Oct 03 00:26:55 crc kubenswrapper[4832]: I1003 00:26:55.718960 4832 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-1-build_c299209f-5aad-4541-b24a-d08e64b0f177/docker-build/0.log" Oct 03 00:26:55 crc kubenswrapper[4832]: I1003 00:26:55.719468 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-1-build" event={"ID":"c299209f-5aad-4541-b24a-d08e64b0f177","Type":"ContainerDied","Data":"c704464addf774084083a3159c2811bcae3015c4b239e800fc8c9029cb4c5e1d"} Oct 03 00:26:55 crc kubenswrapper[4832]: I1003 00:26:55.719513 4832 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-1-build" Oct 03 00:26:55 crc kubenswrapper[4832]: I1003 00:26:55.719520 4832 scope.go:117] "RemoveContainer" containerID="725e727c70b04fc9b4ad800181a1968914305eb8d5e34bf9cd2797223c5fdeb5" Oct 03 00:26:55 crc kubenswrapper[4832]: I1003 00:26:55.780606 4832 scope.go:117] "RemoveContainer" containerID="c81b337bef55594cfe88c267b978b0f622e6a22cfd2b4ac5a2d52fdd552d4139" Oct 03 00:26:56 crc kubenswrapper[4832]: I1003 00:26:56.067607 4832 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/smart-gateway-operator-2-build"] Oct 03 00:26:56 crc kubenswrapper[4832]: E1003 00:26:56.067879 4832 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c299209f-5aad-4541-b24a-d08e64b0f177" containerName="docker-build" Oct 03 00:26:56 crc kubenswrapper[4832]: I1003 00:26:56.067900 4832 state_mem.go:107] "Deleted CPUSet assignment" podUID="c299209f-5aad-4541-b24a-d08e64b0f177" containerName="docker-build" Oct 03 00:26:56 crc kubenswrapper[4832]: E1003 00:26:56.067913 4832 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c299209f-5aad-4541-b24a-d08e64b0f177" containerName="manage-dockerfile" Oct 03 00:26:56 crc kubenswrapper[4832]: I1003 00:26:56.067920 4832 state_mem.go:107] "Deleted CPUSet assignment" podUID="c299209f-5aad-4541-b24a-d08e64b0f177" containerName="manage-dockerfile" Oct 03 00:26:56 crc kubenswrapper[4832]: I1003 00:26:56.068072 4832 memory_manager.go:354] "RemoveStaleState removing state" podUID="c299209f-5aad-4541-b24a-d08e64b0f177" containerName="docker-build" Oct 03 00:26:56 crc kubenswrapper[4832]: I1003 00:26:56.069028 4832 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/smart-gateway-operator-2-build" Oct 03 00:26:56 crc kubenswrapper[4832]: I1003 00:26:56.071880 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-2-sys-config" Oct 03 00:26:56 crc kubenswrapper[4832]: I1003 00:26:56.072124 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-2-global-ca" Oct 03 00:26:56 crc kubenswrapper[4832]: I1003 00:26:56.072604 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-2-ca" Oct 03 00:26:56 crc kubenswrapper[4832]: I1003 00:26:56.080185 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-2-build"] Oct 03 00:26:56 crc kubenswrapper[4832]: I1003 00:26:56.096791 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/8fed0fb0-9fba-4012-b65c-b29651ff85ab-node-pullsecrets\") pod \"smart-gateway-operator-2-build\" (UID: \"8fed0fb0-9fba-4012-b65c-b29651ff85ab\") " pod="service-telemetry/smart-gateway-operator-2-build" Oct 03 00:26:56 crc kubenswrapper[4832]: I1003 00:26:56.096837 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8fed0fb0-9fba-4012-b65c-b29651ff85ab-build-ca-bundles\") pod \"smart-gateway-operator-2-build\" (UID: \"8fed0fb0-9fba-4012-b65c-b29651ff85ab\") " pod="service-telemetry/smart-gateway-operator-2-build" Oct 03 00:26:56 crc kubenswrapper[4832]: I1003 00:26:56.096864 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-bl95s-pull\" (UniqueName: \"kubernetes.io/secret/8fed0fb0-9fba-4012-b65c-b29651ff85ab-builder-dockercfg-bl95s-pull\") pod \"smart-gateway-operator-2-build\" (UID: \"8fed0fb0-9fba-4012-b65c-b29651ff85ab\") " pod="service-telemetry/smart-gateway-operator-2-build" Oct 03 00:26:56 crc kubenswrapper[4832]: I1003 00:26:56.096885 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/8fed0fb0-9fba-4012-b65c-b29651ff85ab-buildworkdir\") pod \"smart-gateway-operator-2-build\" (UID: \"8fed0fb0-9fba-4012-b65c-b29651ff85ab\") " pod="service-telemetry/smart-gateway-operator-2-build" Oct 03 00:26:56 crc kubenswrapper[4832]: I1003 00:26:56.096908 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbm5f\" (UniqueName: \"kubernetes.io/projected/8fed0fb0-9fba-4012-b65c-b29651ff85ab-kube-api-access-jbm5f\") pod \"smart-gateway-operator-2-build\" (UID: \"8fed0fb0-9fba-4012-b65c-b29651ff85ab\") " pod="service-telemetry/smart-gateway-operator-2-build" Oct 03 00:26:56 crc kubenswrapper[4832]: I1003 00:26:56.096930 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-bl95s-push\" (UniqueName: \"kubernetes.io/secret/8fed0fb0-9fba-4012-b65c-b29651ff85ab-builder-dockercfg-bl95s-push\") pod \"smart-gateway-operator-2-build\" (UID: \"8fed0fb0-9fba-4012-b65c-b29651ff85ab\") " pod="service-telemetry/smart-gateway-operator-2-build" Oct 03 00:26:56 crc kubenswrapper[4832]: I1003 00:26:56.096982 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/8fed0fb0-9fba-4012-b65c-b29651ff85ab-buildcachedir\") pod \"smart-gateway-operator-2-build\" (UID: \"8fed0fb0-9fba-4012-b65c-b29651ff85ab\") " pod="service-telemetry/smart-gateway-operator-2-build" Oct 03 00:26:56 crc kubenswrapper[4832]: I1003 00:26:56.097003 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8fed0fb0-9fba-4012-b65c-b29651ff85ab-build-proxy-ca-bundles\") pod \"smart-gateway-operator-2-build\" (UID: \"8fed0fb0-9fba-4012-b65c-b29651ff85ab\") " pod="service-telemetry/smart-gateway-operator-2-build" Oct 03 00:26:56 crc kubenswrapper[4832]: I1003 00:26:56.097032 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/8fed0fb0-9fba-4012-b65c-b29651ff85ab-build-blob-cache\") pod \"smart-gateway-operator-2-build\" (UID: \"8fed0fb0-9fba-4012-b65c-b29651ff85ab\") " pod="service-telemetry/smart-gateway-operator-2-build" Oct 03 00:26:56 crc kubenswrapper[4832]: I1003 00:26:56.097056 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/8fed0fb0-9fba-4012-b65c-b29651ff85ab-build-system-configs\") pod \"smart-gateway-operator-2-build\" (UID: \"8fed0fb0-9fba-4012-b65c-b29651ff85ab\") " pod="service-telemetry/smart-gateway-operator-2-build" Oct 03 00:26:56 crc kubenswrapper[4832]: I1003 00:26:56.097078 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/8fed0fb0-9fba-4012-b65c-b29651ff85ab-container-storage-root\") pod \"smart-gateway-operator-2-build\" (UID: \"8fed0fb0-9fba-4012-b65c-b29651ff85ab\") " pod="service-telemetry/smart-gateway-operator-2-build" Oct 03 00:26:56 crc kubenswrapper[4832]: I1003 00:26:56.097103 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/8fed0fb0-9fba-4012-b65c-b29651ff85ab-container-storage-run\") pod \"smart-gateway-operator-2-build\" (UID: \"8fed0fb0-9fba-4012-b65c-b29651ff85ab\") " pod="service-telemetry/smart-gateway-operator-2-build" Oct 03 00:26:56 crc kubenswrapper[4832]: I1003 00:26:56.128541 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c299209f-5aad-4541-b24a-d08e64b0f177-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "c299209f-5aad-4541-b24a-d08e64b0f177" (UID: "c299209f-5aad-4541-b24a-d08e64b0f177"). InnerVolumeSpecName "container-storage-root". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 00:26:56 crc kubenswrapper[4832]: I1003 00:26:56.198003 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/8fed0fb0-9fba-4012-b65c-b29651ff85ab-node-pullsecrets\") pod \"smart-gateway-operator-2-build\" (UID: \"8fed0fb0-9fba-4012-b65c-b29651ff85ab\") " pod="service-telemetry/smart-gateway-operator-2-build" Oct 03 00:26:56 crc kubenswrapper[4832]: I1003 00:26:56.198055 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8fed0fb0-9fba-4012-b65c-b29651ff85ab-build-ca-bundles\") pod \"smart-gateway-operator-2-build\" (UID: \"8fed0fb0-9fba-4012-b65c-b29651ff85ab\") " pod="service-telemetry/smart-gateway-operator-2-build" Oct 03 00:26:56 crc kubenswrapper[4832]: I1003 00:26:56.198077 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-bl95s-pull\" (UniqueName: \"kubernetes.io/secret/8fed0fb0-9fba-4012-b65c-b29651ff85ab-builder-dockercfg-bl95s-pull\") pod \"smart-gateway-operator-2-build\" (UID: \"8fed0fb0-9fba-4012-b65c-b29651ff85ab\") " pod="service-telemetry/smart-gateway-operator-2-build" Oct 03 00:26:56 crc kubenswrapper[4832]: I1003 00:26:56.198099 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/8fed0fb0-9fba-4012-b65c-b29651ff85ab-buildworkdir\") pod \"smart-gateway-operator-2-build\" (UID: \"8fed0fb0-9fba-4012-b65c-b29651ff85ab\") " pod="service-telemetry/smart-gateway-operator-2-build" Oct 03 00:26:56 crc kubenswrapper[4832]: I1003 00:26:56.198117 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jbm5f\" (UniqueName: \"kubernetes.io/projected/8fed0fb0-9fba-4012-b65c-b29651ff85ab-kube-api-access-jbm5f\") pod \"smart-gateway-operator-2-build\" (UID: \"8fed0fb0-9fba-4012-b65c-b29651ff85ab\") " pod="service-telemetry/smart-gateway-operator-2-build" Oct 03 00:26:56 crc kubenswrapper[4832]: I1003 00:26:56.198140 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-bl95s-push\" (UniqueName: \"kubernetes.io/secret/8fed0fb0-9fba-4012-b65c-b29651ff85ab-builder-dockercfg-bl95s-push\") pod \"smart-gateway-operator-2-build\" (UID: \"8fed0fb0-9fba-4012-b65c-b29651ff85ab\") " pod="service-telemetry/smart-gateway-operator-2-build" Oct 03 00:26:56 crc kubenswrapper[4832]: I1003 00:26:56.198201 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/8fed0fb0-9fba-4012-b65c-b29651ff85ab-buildcachedir\") pod \"smart-gateway-operator-2-build\" (UID: \"8fed0fb0-9fba-4012-b65c-b29651ff85ab\") " pod="service-telemetry/smart-gateway-operator-2-build" Oct 03 00:26:56 crc kubenswrapper[4832]: I1003 00:26:56.198269 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/8fed0fb0-9fba-4012-b65c-b29651ff85ab-buildcachedir\") pod \"smart-gateway-operator-2-build\" (UID: \"8fed0fb0-9fba-4012-b65c-b29651ff85ab\") " pod="service-telemetry/smart-gateway-operator-2-build" Oct 03 00:26:56 crc kubenswrapper[4832]: I1003 00:26:56.198322 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/8fed0fb0-9fba-4012-b65c-b29651ff85ab-build-proxy-ca-bundles\") pod \"smart-gateway-operator-2-build\" (UID: \"8fed0fb0-9fba-4012-b65c-b29651ff85ab\") " pod="service-telemetry/smart-gateway-operator-2-build" Oct 03 00:26:56 crc kubenswrapper[4832]: I1003 00:26:56.198192 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/8fed0fb0-9fba-4012-b65c-b29651ff85ab-node-pullsecrets\") pod \"smart-gateway-operator-2-build\" (UID: \"8fed0fb0-9fba-4012-b65c-b29651ff85ab\") " pod="service-telemetry/smart-gateway-operator-2-build" Oct 03 00:26:56 crc kubenswrapper[4832]: I1003 00:26:56.198353 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/8fed0fb0-9fba-4012-b65c-b29651ff85ab-build-blob-cache\") pod \"smart-gateway-operator-2-build\" (UID: \"8fed0fb0-9fba-4012-b65c-b29651ff85ab\") " pod="service-telemetry/smart-gateway-operator-2-build" Oct 03 00:26:56 crc kubenswrapper[4832]: I1003 00:26:56.199136 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/8fed0fb0-9fba-4012-b65c-b29651ff85ab-container-storage-root\") pod \"smart-gateway-operator-2-build\" (UID: \"8fed0fb0-9fba-4012-b65c-b29651ff85ab\") " pod="service-telemetry/smart-gateway-operator-2-build" Oct 03 00:26:56 crc kubenswrapper[4832]: I1003 00:26:56.199154 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/8fed0fb0-9fba-4012-b65c-b29651ff85ab-build-system-configs\") pod \"smart-gateway-operator-2-build\" (UID: \"8fed0fb0-9fba-4012-b65c-b29651ff85ab\") " pod="service-telemetry/smart-gateway-operator-2-build" Oct 03 00:26:56 crc kubenswrapper[4832]: I1003 00:26:56.199596 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/8fed0fb0-9fba-4012-b65c-b29651ff85ab-container-storage-run\") pod \"smart-gateway-operator-2-build\" (UID: \"8fed0fb0-9fba-4012-b65c-b29651ff85ab\") " pod="service-telemetry/smart-gateway-operator-2-build" Oct 03 00:26:56 crc kubenswrapper[4832]: I1003 00:26:56.199074 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8fed0fb0-9fba-4012-b65c-b29651ff85ab-build-proxy-ca-bundles\") pod \"smart-gateway-operator-2-build\" (UID: \"8fed0fb0-9fba-4012-b65c-b29651ff85ab\") " pod="service-telemetry/smart-gateway-operator-2-build" Oct 03 00:26:56 crc kubenswrapper[4832]: I1003 00:26:56.198666 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/8fed0fb0-9fba-4012-b65c-b29651ff85ab-buildworkdir\") pod \"smart-gateway-operator-2-build\" (UID: \"8fed0fb0-9fba-4012-b65c-b29651ff85ab\") " pod="service-telemetry/smart-gateway-operator-2-build" Oct 03 00:26:56 crc kubenswrapper[4832]: I1003 00:26:56.199214 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8fed0fb0-9fba-4012-b65c-b29651ff85ab-build-ca-bundles\") pod \"smart-gateway-operator-2-build\" (UID: \"8fed0fb0-9fba-4012-b65c-b29651ff85ab\") " pod="service-telemetry/smart-gateway-operator-2-build" Oct 03 00:26:56 crc kubenswrapper[4832]: I1003 00:26:56.199552 4832 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/8fed0fb0-9fba-4012-b65c-b29651ff85ab-container-storage-root\") pod \"smart-gateway-operator-2-build\" (UID: \"8fed0fb0-9fba-4012-b65c-b29651ff85ab\") " pod="service-telemetry/smart-gateway-operator-2-build" Oct 03 00:26:56 crc kubenswrapper[4832]: I1003 00:26:56.199683 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/8fed0fb0-9fba-4012-b65c-b29651ff85ab-build-system-configs\") pod \"smart-gateway-operator-2-build\" (UID: \"8fed0fb0-9fba-4012-b65c-b29651ff85ab\") " pod="service-telemetry/smart-gateway-operator-2-build" Oct 03 00:26:56 crc kubenswrapper[4832]: I1003 00:26:56.198865 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/8fed0fb0-9fba-4012-b65c-b29651ff85ab-build-blob-cache\") pod \"smart-gateway-operator-2-build\" (UID: \"8fed0fb0-9fba-4012-b65c-b29651ff85ab\") " pod="service-telemetry/smart-gateway-operator-2-build" Oct 03 00:26:56 crc kubenswrapper[4832]: I1003 00:26:56.199948 4832 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/c299209f-5aad-4541-b24a-d08e64b0f177-container-storage-root\") on node \"crc\" DevicePath \"\"" Oct 03 00:26:56 crc kubenswrapper[4832]: I1003 00:26:56.199977 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/8fed0fb0-9fba-4012-b65c-b29651ff85ab-container-storage-run\") pod \"smart-gateway-operator-2-build\" (UID: \"8fed0fb0-9fba-4012-b65c-b29651ff85ab\") " pod="service-telemetry/smart-gateway-operator-2-build" Oct 03 00:26:56 crc kubenswrapper[4832]: I1003 00:26:56.211428 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-bl95s-pull\" (UniqueName: \"kubernetes.io/secret/8fed0fb0-9fba-4012-b65c-b29651ff85ab-builder-dockercfg-bl95s-pull\") pod \"smart-gateway-operator-2-build\" (UID: \"8fed0fb0-9fba-4012-b65c-b29651ff85ab\") " pod="service-telemetry/smart-gateway-operator-2-build" Oct 03 00:26:56 crc kubenswrapper[4832]: I1003 00:26:56.211570 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-bl95s-push\" (UniqueName: \"kubernetes.io/secret/8fed0fb0-9fba-4012-b65c-b29651ff85ab-builder-dockercfg-bl95s-push\") pod \"smart-gateway-operator-2-build\" (UID: \"8fed0fb0-9fba-4012-b65c-b29651ff85ab\") " pod="service-telemetry/smart-gateway-operator-2-build" Oct 03 00:26:56 crc kubenswrapper[4832]: I1003 00:26:56.216360 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbm5f\" (UniqueName: \"kubernetes.io/projected/8fed0fb0-9fba-4012-b65c-b29651ff85ab-kube-api-access-jbm5f\") pod \"smart-gateway-operator-2-build\" (UID: \"8fed0fb0-9fba-4012-b65c-b29651ff85ab\") " pod="service-telemetry/smart-gateway-operator-2-build" Oct 03 00:26:56 crc kubenswrapper[4832]: I1003 00:26:56.352841 4832 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/smart-gateway-operator-1-build"] Oct 03 00:26:56 crc kubenswrapper[4832]: I1003 00:26:56.357267 4832 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/smart-gateway-operator-1-build"] Oct 03 00:26:56 crc kubenswrapper[4832]: I1003 00:26:56.396515 4832 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/smart-gateway-operator-2-build" Oct 03 00:26:56 crc kubenswrapper[4832]: I1003 00:26:56.778022 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-2-build"] Oct 03 00:26:57 crc kubenswrapper[4832]: I1003 00:26:57.394232 4832 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c299209f-5aad-4541-b24a-d08e64b0f177" path="/var/lib/kubelet/pods/c299209f-5aad-4541-b24a-d08e64b0f177/volumes" Oct 03 00:26:57 crc kubenswrapper[4832]: I1003 00:26:57.735551 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-2-build" event={"ID":"8fed0fb0-9fba-4012-b65c-b29651ff85ab","Type":"ContainerStarted","Data":"2b4f6aad0c46048b4113bc9ed87b02f3d97b6351658c40f83351a36be8e7628e"} Oct 03 00:26:57 crc kubenswrapper[4832]: I1003 00:26:57.735604 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-2-build" event={"ID":"8fed0fb0-9fba-4012-b65c-b29651ff85ab","Type":"ContainerStarted","Data":"18855115e62546b8fdee7e27dad7f87ccfc25e92eb74f50c0f8b75f59455bbdd"} Oct 03 00:26:57 crc kubenswrapper[4832]: E1003 00:26:57.844259 4832 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 38.102.83.162:48404->38.102.83.162:41165: read tcp 38.102.83.162:48404->38.102.83.162:41165: read: connection reset by peer Oct 03 00:26:58 crc kubenswrapper[4832]: I1003 00:26:58.743119 4832 generic.go:334] "Generic (PLEG): container finished" podID="8fed0fb0-9fba-4012-b65c-b29651ff85ab" containerID="2b4f6aad0c46048b4113bc9ed87b02f3d97b6351658c40f83351a36be8e7628e" exitCode=0 Oct 03 00:26:58 crc kubenswrapper[4832]: I1003 00:26:58.743166 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-2-build" event={"ID":"8fed0fb0-9fba-4012-b65c-b29651ff85ab","Type":"ContainerDied","Data":"2b4f6aad0c46048b4113bc9ed87b02f3d97b6351658c40f83351a36be8e7628e"} Oct 03 00:26:59 crc kubenswrapper[4832]: I1003 00:26:59.751525 4832 generic.go:334] "Generic (PLEG): container finished" podID="8fed0fb0-9fba-4012-b65c-b29651ff85ab" containerID="4b43fe51006d00f58aa024a9ff6acfda26c9ae85bd63d56420df75bc3f00ac21" exitCode=0 Oct 03 00:26:59 crc kubenswrapper[4832]: I1003 00:26:59.751922 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-2-build" event={"ID":"8fed0fb0-9fba-4012-b65c-b29651ff85ab","Type":"ContainerDied","Data":"4b43fe51006d00f58aa024a9ff6acfda26c9ae85bd63d56420df75bc3f00ac21"} Oct 03 00:26:59 crc kubenswrapper[4832]: I1003 00:26:59.782944 4832 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-2-build_8fed0fb0-9fba-4012-b65c-b29651ff85ab/manage-dockerfile/0.log" Oct 03 00:27:00 crc kubenswrapper[4832]: I1003 00:27:00.769665 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-2-build" event={"ID":"8fed0fb0-9fba-4012-b65c-b29651ff85ab","Type":"ContainerStarted","Data":"a47b40b7869cf05aebfca196d29322ecdff09a900f94624a0f6671af5bc3e243"} Oct 03 00:27:00 crc kubenswrapper[4832]: I1003 00:27:00.802736 4832 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/smart-gateway-operator-2-build" podStartSLOduration=4.802716667 podStartE2EDuration="4.802716667s" podCreationTimestamp="2025-10-03 00:26:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 
UTC" observedRunningTime="2025-10-03 00:27:00.802691256 +0000 UTC m=+1214.064846382" watchObservedRunningTime="2025-10-03 00:27:00.802716667 +0000 UTC m=+1214.064871783" Oct 03 00:27:26 crc kubenswrapper[4832]: I1003 00:27:26.397437 4832 patch_prober.go:28] interesting pod/machine-config-daemon-xzpk4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 00:27:26 crc kubenswrapper[4832]: I1003 00:27:26.398047 4832 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" podUID="13eeb165-19f7-4e3f-a442-1c9c570d853b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 00:27:56 crc kubenswrapper[4832]: I1003 00:27:56.396508 4832 patch_prober.go:28] interesting pod/machine-config-daemon-xzpk4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 00:27:56 crc kubenswrapper[4832]: I1003 00:27:56.397032 4832 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" podUID="13eeb165-19f7-4e3f-a442-1c9c570d853b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 00:28:09 crc kubenswrapper[4832]: I1003 00:28:09.136264 4832 generic.go:334] "Generic (PLEG): container finished" podID="8fed0fb0-9fba-4012-b65c-b29651ff85ab" containerID="a47b40b7869cf05aebfca196d29322ecdff09a900f94624a0f6671af5bc3e243" exitCode=0 Oct 03 00:28:09 crc kubenswrapper[4832]: I1003 00:28:09.136316 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-2-build" event={"ID":"8fed0fb0-9fba-4012-b65c-b29651ff85ab","Type":"ContainerDied","Data":"a47b40b7869cf05aebfca196d29322ecdff09a900f94624a0f6671af5bc3e243"} Oct 03 00:28:10 crc kubenswrapper[4832]: I1003 00:28:10.392048 4832 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/smart-gateway-operator-2-build" Oct 03 00:28:10 crc kubenswrapper[4832]: I1003 00:28:10.588299 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/8fed0fb0-9fba-4012-b65c-b29651ff85ab-buildcachedir\") pod \"8fed0fb0-9fba-4012-b65c-b29651ff85ab\" (UID: \"8fed0fb0-9fba-4012-b65c-b29651ff85ab\") " Oct 03 00:28:10 crc kubenswrapper[4832]: I1003 00:28:10.588641 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/8fed0fb0-9fba-4012-b65c-b29651ff85ab-container-storage-root\") pod \"8fed0fb0-9fba-4012-b65c-b29651ff85ab\" (UID: \"8fed0fb0-9fba-4012-b65c-b29651ff85ab\") " Oct 03 00:28:10 crc kubenswrapper[4832]: I1003 00:28:10.588674 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-bl95s-pull\" (UniqueName: \"kubernetes.io/secret/8fed0fb0-9fba-4012-b65c-b29651ff85ab-builder-dockercfg-bl95s-pull\") pod \"8fed0fb0-9fba-4012-b65c-b29651ff85ab\" (UID: \"8fed0fb0-9fba-4012-b65c-b29651ff85ab\") " Oct 03 00:28:10 crc kubenswrapper[4832]: I1003 00:28:10.588796 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-bl95s-push\" (UniqueName: \"kubernetes.io/secret/8fed0fb0-9fba-4012-b65c-b29651ff85ab-builder-dockercfg-bl95s-push\") pod \"8fed0fb0-9fba-4012-b65c-b29651ff85ab\" (UID: \"8fed0fb0-9fba-4012-b65c-b29651ff85ab\") " Oct 03 00:28:10 crc kubenswrapper[4832]: I1003 00:28:10.588827 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/8fed0fb0-9fba-4012-b65c-b29651ff85ab-build-system-configs\") pod \"8fed0fb0-9fba-4012-b65c-b29651ff85ab\" (UID: \"8fed0fb0-9fba-4012-b65c-b29651ff85ab\") " Oct 03 00:28:10 crc kubenswrapper[4832]: I1003 00:28:10.588848 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/8fed0fb0-9fba-4012-b65c-b29651ff85ab-container-storage-run\") pod \"8fed0fb0-9fba-4012-b65c-b29651ff85ab\" (UID: \"8fed0fb0-9fba-4012-b65c-b29651ff85ab\") " Oct 03 00:28:10 crc kubenswrapper[4832]: I1003 00:28:10.588867 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8fed0fb0-9fba-4012-b65c-b29651ff85ab-build-ca-bundles\") pod \"8fed0fb0-9fba-4012-b65c-b29651ff85ab\" (UID: \"8fed0fb0-9fba-4012-b65c-b29651ff85ab\") " Oct 03 00:28:10 crc kubenswrapper[4832]: I1003 00:28:10.588923 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/8fed0fb0-9fba-4012-b65c-b29651ff85ab-buildworkdir\") pod \"8fed0fb0-9fba-4012-b65c-b29651ff85ab\" (UID: \"8fed0fb0-9fba-4012-b65c-b29651ff85ab\") " Oct 03 00:28:10 crc kubenswrapper[4832]: I1003 00:28:10.588963 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/8fed0fb0-9fba-4012-b65c-b29651ff85ab-build-blob-cache\") pod \"8fed0fb0-9fba-4012-b65c-b29651ff85ab\" (UID: \"8fed0fb0-9fba-4012-b65c-b29651ff85ab\") " Oct 03 00:28:10 crc kubenswrapper[4832]: I1003 00:28:10.588986 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jbm5f\" (UniqueName: 
\"kubernetes.io/projected/8fed0fb0-9fba-4012-b65c-b29651ff85ab-kube-api-access-jbm5f\") pod \"8fed0fb0-9fba-4012-b65c-b29651ff85ab\" (UID: \"8fed0fb0-9fba-4012-b65c-b29651ff85ab\") " Oct 03 00:28:10 crc kubenswrapper[4832]: I1003 00:28:10.589012 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/8fed0fb0-9fba-4012-b65c-b29651ff85ab-node-pullsecrets\") pod \"8fed0fb0-9fba-4012-b65c-b29651ff85ab\" (UID: \"8fed0fb0-9fba-4012-b65c-b29651ff85ab\") " Oct 03 00:28:10 crc kubenswrapper[4832]: I1003 00:28:10.589028 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8fed0fb0-9fba-4012-b65c-b29651ff85ab-build-proxy-ca-bundles\") pod \"8fed0fb0-9fba-4012-b65c-b29651ff85ab\" (UID: \"8fed0fb0-9fba-4012-b65c-b29651ff85ab\") " Oct 03 00:28:10 crc kubenswrapper[4832]: I1003 00:28:10.588551 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8fed0fb0-9fba-4012-b65c-b29651ff85ab-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "8fed0fb0-9fba-4012-b65c-b29651ff85ab" (UID: "8fed0fb0-9fba-4012-b65c-b29651ff85ab"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 00:28:10 crc kubenswrapper[4832]: I1003 00:28:10.589480 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8fed0fb0-9fba-4012-b65c-b29651ff85ab-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "8fed0fb0-9fba-4012-b65c-b29651ff85ab" (UID: "8fed0fb0-9fba-4012-b65c-b29651ff85ab"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 00:28:10 crc kubenswrapper[4832]: I1003 00:28:10.590191 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8fed0fb0-9fba-4012-b65c-b29651ff85ab-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "8fed0fb0-9fba-4012-b65c-b29651ff85ab" (UID: "8fed0fb0-9fba-4012-b65c-b29651ff85ab"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 00:28:10 crc kubenswrapper[4832]: I1003 00:28:10.590710 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8fed0fb0-9fba-4012-b65c-b29651ff85ab-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "8fed0fb0-9fba-4012-b65c-b29651ff85ab" (UID: "8fed0fb0-9fba-4012-b65c-b29651ff85ab"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 00:28:10 crc kubenswrapper[4832]: I1003 00:28:10.590900 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8fed0fb0-9fba-4012-b65c-b29651ff85ab-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "8fed0fb0-9fba-4012-b65c-b29651ff85ab" (UID: "8fed0fb0-9fba-4012-b65c-b29651ff85ab"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 00:28:10 crc kubenswrapper[4832]: I1003 00:28:10.593616 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8fed0fb0-9fba-4012-b65c-b29651ff85ab-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "8fed0fb0-9fba-4012-b65c-b29651ff85ab" (UID: "8fed0fb0-9fba-4012-b65c-b29651ff85ab"). InnerVolumeSpecName "buildworkdir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 00:28:10 crc kubenswrapper[4832]: I1003 00:28:10.594092 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8fed0fb0-9fba-4012-b65c-b29651ff85ab-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "8fed0fb0-9fba-4012-b65c-b29651ff85ab" (UID: "8fed0fb0-9fba-4012-b65c-b29651ff85ab"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 00:28:10 crc kubenswrapper[4832]: I1003 00:28:10.594355 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fed0fb0-9fba-4012-b65c-b29651ff85ab-builder-dockercfg-bl95s-push" (OuterVolumeSpecName: "builder-dockercfg-bl95s-push") pod "8fed0fb0-9fba-4012-b65c-b29651ff85ab" (UID: "8fed0fb0-9fba-4012-b65c-b29651ff85ab"). InnerVolumeSpecName "builder-dockercfg-bl95s-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 00:28:10 crc kubenswrapper[4832]: I1003 00:28:10.596195 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8fed0fb0-9fba-4012-b65c-b29651ff85ab-kube-api-access-jbm5f" (OuterVolumeSpecName: "kube-api-access-jbm5f") pod "8fed0fb0-9fba-4012-b65c-b29651ff85ab" (UID: "8fed0fb0-9fba-4012-b65c-b29651ff85ab"). InnerVolumeSpecName "kube-api-access-jbm5f". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 00:28:10 crc kubenswrapper[4832]: I1003 00:28:10.601742 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fed0fb0-9fba-4012-b65c-b29651ff85ab-builder-dockercfg-bl95s-pull" (OuterVolumeSpecName: "builder-dockercfg-bl95s-pull") pod "8fed0fb0-9fba-4012-b65c-b29651ff85ab" (UID: "8fed0fb0-9fba-4012-b65c-b29651ff85ab"). InnerVolumeSpecName "builder-dockercfg-bl95s-pull". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 00:28:10 crc kubenswrapper[4832]: I1003 00:28:10.689907 4832 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/8fed0fb0-9fba-4012-b65c-b29651ff85ab-buildcachedir\") on node \"crc\" DevicePath \"\"" Oct 03 00:28:10 crc kubenswrapper[4832]: I1003 00:28:10.689946 4832 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-bl95s-pull\" (UniqueName: \"kubernetes.io/secret/8fed0fb0-9fba-4012-b65c-b29651ff85ab-builder-dockercfg-bl95s-pull\") on node \"crc\" DevicePath \"\"" Oct 03 00:28:10 crc kubenswrapper[4832]: I1003 00:28:10.689961 4832 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-bl95s-push\" (UniqueName: \"kubernetes.io/secret/8fed0fb0-9fba-4012-b65c-b29651ff85ab-builder-dockercfg-bl95s-push\") on node \"crc\" DevicePath \"\"" Oct 03 00:28:10 crc kubenswrapper[4832]: I1003 00:28:10.689973 4832 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/8fed0fb0-9fba-4012-b65c-b29651ff85ab-build-system-configs\") on node \"crc\" DevicePath \"\"" Oct 03 00:28:10 crc kubenswrapper[4832]: I1003 00:28:10.689985 4832 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/8fed0fb0-9fba-4012-b65c-b29651ff85ab-container-storage-run\") on node \"crc\" DevicePath \"\"" Oct 03 00:28:10 crc kubenswrapper[4832]: I1003 00:28:10.689996 4832 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8fed0fb0-9fba-4012-b65c-b29651ff85ab-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 03 00:28:10 crc kubenswrapper[4832]: I1003 00:28:10.690010 4832 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/8fed0fb0-9fba-4012-b65c-b29651ff85ab-buildworkdir\") on node \"crc\" DevicePath \"\"" Oct 03 00:28:10 crc kubenswrapper[4832]: I1003 00:28:10.690022 4832 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jbm5f\" (UniqueName: \"kubernetes.io/projected/8fed0fb0-9fba-4012-b65c-b29651ff85ab-kube-api-access-jbm5f\") on node \"crc\" DevicePath \"\"" Oct 03 00:28:10 crc kubenswrapper[4832]: I1003 00:28:10.690033 4832 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/8fed0fb0-9fba-4012-b65c-b29651ff85ab-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Oct 03 00:28:10 crc kubenswrapper[4832]: I1003 00:28:10.690045 4832 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8fed0fb0-9fba-4012-b65c-b29651ff85ab-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 03 00:28:10 crc kubenswrapper[4832]: I1003 00:28:10.782413 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8fed0fb0-9fba-4012-b65c-b29651ff85ab-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "8fed0fb0-9fba-4012-b65c-b29651ff85ab" (UID: "8fed0fb0-9fba-4012-b65c-b29651ff85ab"). InnerVolumeSpecName "build-blob-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 00:28:10 crc kubenswrapper[4832]: I1003 00:28:10.791080 4832 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/8fed0fb0-9fba-4012-b65c-b29651ff85ab-build-blob-cache\") on node \"crc\" DevicePath \"\"" Oct 03 00:28:11 crc kubenswrapper[4832]: I1003 00:28:11.152016 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-2-build" event={"ID":"8fed0fb0-9fba-4012-b65c-b29651ff85ab","Type":"ContainerDied","Data":"18855115e62546b8fdee7e27dad7f87ccfc25e92eb74f50c0f8b75f59455bbdd"} Oct 03 00:28:11 crc kubenswrapper[4832]: I1003 00:28:11.152056 4832 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="18855115e62546b8fdee7e27dad7f87ccfc25e92eb74f50c0f8b75f59455bbdd" Oct 03 00:28:11 crc kubenswrapper[4832]: I1003 00:28:11.152076 4832 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-2-build" Oct 03 00:28:12 crc kubenswrapper[4832]: I1003 00:28:12.575536 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8fed0fb0-9fba-4012-b65c-b29651ff85ab-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "8fed0fb0-9fba-4012-b65c-b29651ff85ab" (UID: "8fed0fb0-9fba-4012-b65c-b29651ff85ab"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 00:28:12 crc kubenswrapper[4832]: I1003 00:28:12.613416 4832 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/8fed0fb0-9fba-4012-b65c-b29651ff85ab-container-storage-root\") on node \"crc\" DevicePath \"\"" Oct 03 00:28:15 crc kubenswrapper[4832]: I1003 00:28:15.369854 4832 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/sg-core-1-build"] Oct 03 00:28:15 crc kubenswrapper[4832]: E1003 00:28:15.370311 4832 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fed0fb0-9fba-4012-b65c-b29651ff85ab" containerName="docker-build" Oct 03 00:28:15 crc kubenswrapper[4832]: I1003 00:28:15.370324 4832 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fed0fb0-9fba-4012-b65c-b29651ff85ab" containerName="docker-build" Oct 03 00:28:15 crc kubenswrapper[4832]: E1003 00:28:15.370337 4832 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fed0fb0-9fba-4012-b65c-b29651ff85ab" containerName="git-clone" Oct 03 00:28:15 crc kubenswrapper[4832]: I1003 00:28:15.370343 4832 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fed0fb0-9fba-4012-b65c-b29651ff85ab" containerName="git-clone" Oct 03 00:28:15 crc kubenswrapper[4832]: E1003 00:28:15.370356 4832 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fed0fb0-9fba-4012-b65c-b29651ff85ab" containerName="manage-dockerfile" Oct 03 00:28:15 crc kubenswrapper[4832]: I1003 00:28:15.370364 4832 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fed0fb0-9fba-4012-b65c-b29651ff85ab" containerName="manage-dockerfile" Oct 03 00:28:15 crc kubenswrapper[4832]: I1003 00:28:15.370469 4832 memory_manager.go:354] "RemoveStaleState removing state" podUID="8fed0fb0-9fba-4012-b65c-b29651ff85ab" containerName="docker-build" Oct 03 00:28:15 crc kubenswrapper[4832]: I1003 00:28:15.371050 4832 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-core-1-build" Oct 03 00:28:15 crc kubenswrapper[4832]: I1003 00:28:15.373302 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-core-1-global-ca" Oct 03 00:28:15 crc kubenswrapper[4832]: I1003 00:28:15.373368 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-core-1-ca" Oct 03 00:28:15 crc kubenswrapper[4832]: I1003 00:28:15.373376 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-core-1-sys-config" Oct 03 00:28:15 crc kubenswrapper[4832]: I1003 00:28:15.373524 4832 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-bl95s" Oct 03 00:28:15 crc kubenswrapper[4832]: I1003 00:28:15.393676 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-core-1-build"] Oct 03 00:28:15 crc kubenswrapper[4832]: I1003 00:28:15.545374 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/204b457e-3787-45ce-9e60-49e1751edaad-container-storage-root\") pod \"sg-core-1-build\" (UID: \"204b457e-3787-45ce-9e60-49e1751edaad\") " pod="service-telemetry/sg-core-1-build" Oct 03 00:28:15 crc kubenswrapper[4832]: I1003 00:28:15.545427 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/204b457e-3787-45ce-9e60-49e1751edaad-node-pullsecrets\") pod \"sg-core-1-build\" (UID: \"204b457e-3787-45ce-9e60-49e1751edaad\") " pod="service-telemetry/sg-core-1-build" Oct 03 00:28:15 crc kubenswrapper[4832]: I1003 00:28:15.545451 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-bl95s-push\" (UniqueName: \"kubernetes.io/secret/204b457e-3787-45ce-9e60-49e1751edaad-builder-dockercfg-bl95s-push\") pod \"sg-core-1-build\" (UID: \"204b457e-3787-45ce-9e60-49e1751edaad\") " pod="service-telemetry/sg-core-1-build" Oct 03 00:28:15 crc kubenswrapper[4832]: I1003 00:28:15.545470 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/204b457e-3787-45ce-9e60-49e1751edaad-build-blob-cache\") pod \"sg-core-1-build\" (UID: \"204b457e-3787-45ce-9e60-49e1751edaad\") " pod="service-telemetry/sg-core-1-build" Oct 03 00:28:15 crc kubenswrapper[4832]: I1003 00:28:15.545494 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/204b457e-3787-45ce-9e60-49e1751edaad-buildcachedir\") pod \"sg-core-1-build\" (UID: \"204b457e-3787-45ce-9e60-49e1751edaad\") " pod="service-telemetry/sg-core-1-build" Oct 03 00:28:15 crc kubenswrapper[4832]: I1003 00:28:15.545512 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-bl95s-pull\" (UniqueName: \"kubernetes.io/secret/204b457e-3787-45ce-9e60-49e1751edaad-builder-dockercfg-bl95s-pull\") pod \"sg-core-1-build\" (UID: \"204b457e-3787-45ce-9e60-49e1751edaad\") " pod="service-telemetry/sg-core-1-build" Oct 03 00:28:15 crc kubenswrapper[4832]: I1003 00:28:15.545531 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: 
\"kubernetes.io/empty-dir/204b457e-3787-45ce-9e60-49e1751edaad-container-storage-run\") pod \"sg-core-1-build\" (UID: \"204b457e-3787-45ce-9e60-49e1751edaad\") " pod="service-telemetry/sg-core-1-build" Oct 03 00:28:15 crc kubenswrapper[4832]: I1003 00:28:15.545549 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/204b457e-3787-45ce-9e60-49e1751edaad-build-system-configs\") pod \"sg-core-1-build\" (UID: \"204b457e-3787-45ce-9e60-49e1751edaad\") " pod="service-telemetry/sg-core-1-build" Oct 03 00:28:15 crc kubenswrapper[4832]: I1003 00:28:15.545570 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/204b457e-3787-45ce-9e60-49e1751edaad-build-proxy-ca-bundles\") pod \"sg-core-1-build\" (UID: \"204b457e-3787-45ce-9e60-49e1751edaad\") " pod="service-telemetry/sg-core-1-build" Oct 03 00:28:15 crc kubenswrapper[4832]: I1003 00:28:15.545634 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxh47\" (UniqueName: \"kubernetes.io/projected/204b457e-3787-45ce-9e60-49e1751edaad-kube-api-access-pxh47\") pod \"sg-core-1-build\" (UID: \"204b457e-3787-45ce-9e60-49e1751edaad\") " pod="service-telemetry/sg-core-1-build" Oct 03 00:28:15 crc kubenswrapper[4832]: I1003 00:28:15.545656 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/204b457e-3787-45ce-9e60-49e1751edaad-buildworkdir\") pod \"sg-core-1-build\" (UID: \"204b457e-3787-45ce-9e60-49e1751edaad\") " pod="service-telemetry/sg-core-1-build" Oct 03 00:28:15 crc kubenswrapper[4832]: I1003 00:28:15.545676 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/204b457e-3787-45ce-9e60-49e1751edaad-build-ca-bundles\") pod \"sg-core-1-build\" (UID: \"204b457e-3787-45ce-9e60-49e1751edaad\") " pod="service-telemetry/sg-core-1-build" Oct 03 00:28:15 crc kubenswrapper[4832]: I1003 00:28:15.646455 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/204b457e-3787-45ce-9e60-49e1751edaad-buildworkdir\") pod \"sg-core-1-build\" (UID: \"204b457e-3787-45ce-9e60-49e1751edaad\") " pod="service-telemetry/sg-core-1-build" Oct 03 00:28:15 crc kubenswrapper[4832]: I1003 00:28:15.646497 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/204b457e-3787-45ce-9e60-49e1751edaad-build-ca-bundles\") pod \"sg-core-1-build\" (UID: \"204b457e-3787-45ce-9e60-49e1751edaad\") " pod="service-telemetry/sg-core-1-build" Oct 03 00:28:15 crc kubenswrapper[4832]: I1003 00:28:15.646556 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/204b457e-3787-45ce-9e60-49e1751edaad-container-storage-root\") pod \"sg-core-1-build\" (UID: \"204b457e-3787-45ce-9e60-49e1751edaad\") " pod="service-telemetry/sg-core-1-build" Oct 03 00:28:15 crc kubenswrapper[4832]: I1003 00:28:15.646574 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: 
\"kubernetes.io/host-path/204b457e-3787-45ce-9e60-49e1751edaad-node-pullsecrets\") pod \"sg-core-1-build\" (UID: \"204b457e-3787-45ce-9e60-49e1751edaad\") " pod="service-telemetry/sg-core-1-build" Oct 03 00:28:15 crc kubenswrapper[4832]: I1003 00:28:15.646593 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-bl95s-push\" (UniqueName: \"kubernetes.io/secret/204b457e-3787-45ce-9e60-49e1751edaad-builder-dockercfg-bl95s-push\") pod \"sg-core-1-build\" (UID: \"204b457e-3787-45ce-9e60-49e1751edaad\") " pod="service-telemetry/sg-core-1-build" Oct 03 00:28:15 crc kubenswrapper[4832]: I1003 00:28:15.646608 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/204b457e-3787-45ce-9e60-49e1751edaad-build-blob-cache\") pod \"sg-core-1-build\" (UID: \"204b457e-3787-45ce-9e60-49e1751edaad\") " pod="service-telemetry/sg-core-1-build" Oct 03 00:28:15 crc kubenswrapper[4832]: I1003 00:28:15.646624 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/204b457e-3787-45ce-9e60-49e1751edaad-buildcachedir\") pod \"sg-core-1-build\" (UID: \"204b457e-3787-45ce-9e60-49e1751edaad\") " pod="service-telemetry/sg-core-1-build" Oct 03 00:28:15 crc kubenswrapper[4832]: I1003 00:28:15.646641 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-bl95s-pull\" (UniqueName: \"kubernetes.io/secret/204b457e-3787-45ce-9e60-49e1751edaad-builder-dockercfg-bl95s-pull\") pod \"sg-core-1-build\" (UID: \"204b457e-3787-45ce-9e60-49e1751edaad\") " pod="service-telemetry/sg-core-1-build" Oct 03 00:28:15 crc kubenswrapper[4832]: I1003 00:28:15.646658 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/204b457e-3787-45ce-9e60-49e1751edaad-container-storage-run\") pod \"sg-core-1-build\" (UID: \"204b457e-3787-45ce-9e60-49e1751edaad\") " pod="service-telemetry/sg-core-1-build" Oct 03 00:28:15 crc kubenswrapper[4832]: I1003 00:28:15.646674 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/204b457e-3787-45ce-9e60-49e1751edaad-build-system-configs\") pod \"sg-core-1-build\" (UID: \"204b457e-3787-45ce-9e60-49e1751edaad\") " pod="service-telemetry/sg-core-1-build" Oct 03 00:28:15 crc kubenswrapper[4832]: I1003 00:28:15.646691 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/204b457e-3787-45ce-9e60-49e1751edaad-build-proxy-ca-bundles\") pod \"sg-core-1-build\" (UID: \"204b457e-3787-45ce-9e60-49e1751edaad\") " pod="service-telemetry/sg-core-1-build" Oct 03 00:28:15 crc kubenswrapper[4832]: I1003 00:28:15.646713 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pxh47\" (UniqueName: \"kubernetes.io/projected/204b457e-3787-45ce-9e60-49e1751edaad-kube-api-access-pxh47\") pod \"sg-core-1-build\" (UID: \"204b457e-3787-45ce-9e60-49e1751edaad\") " pod="service-telemetry/sg-core-1-build" Oct 03 00:28:15 crc kubenswrapper[4832]: I1003 00:28:15.647041 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/204b457e-3787-45ce-9e60-49e1751edaad-buildcachedir\") pod \"sg-core-1-build\" (UID: 
\"204b457e-3787-45ce-9e60-49e1751edaad\") " pod="service-telemetry/sg-core-1-build" Oct 03 00:28:15 crc kubenswrapper[4832]: I1003 00:28:15.647564 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/204b457e-3787-45ce-9e60-49e1751edaad-buildworkdir\") pod \"sg-core-1-build\" (UID: \"204b457e-3787-45ce-9e60-49e1751edaad\") " pod="service-telemetry/sg-core-1-build" Oct 03 00:28:15 crc kubenswrapper[4832]: I1003 00:28:15.647668 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/204b457e-3787-45ce-9e60-49e1751edaad-container-storage-run\") pod \"sg-core-1-build\" (UID: \"204b457e-3787-45ce-9e60-49e1751edaad\") " pod="service-telemetry/sg-core-1-build" Oct 03 00:28:15 crc kubenswrapper[4832]: I1003 00:28:15.647805 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/204b457e-3787-45ce-9e60-49e1751edaad-build-proxy-ca-bundles\") pod \"sg-core-1-build\" (UID: \"204b457e-3787-45ce-9e60-49e1751edaad\") " pod="service-telemetry/sg-core-1-build" Oct 03 00:28:15 crc kubenswrapper[4832]: I1003 00:28:15.647822 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/204b457e-3787-45ce-9e60-49e1751edaad-node-pullsecrets\") pod \"sg-core-1-build\" (UID: \"204b457e-3787-45ce-9e60-49e1751edaad\") " pod="service-telemetry/sg-core-1-build" Oct 03 00:28:15 crc kubenswrapper[4832]: I1003 00:28:15.647964 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/204b457e-3787-45ce-9e60-49e1751edaad-build-system-configs\") pod \"sg-core-1-build\" (UID: \"204b457e-3787-45ce-9e60-49e1751edaad\") " pod="service-telemetry/sg-core-1-build" Oct 03 00:28:15 crc kubenswrapper[4832]: I1003 00:28:15.648142 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/204b457e-3787-45ce-9e60-49e1751edaad-build-blob-cache\") pod \"sg-core-1-build\" (UID: \"204b457e-3787-45ce-9e60-49e1751edaad\") " pod="service-telemetry/sg-core-1-build" Oct 03 00:28:15 crc kubenswrapper[4832]: I1003 00:28:15.648325 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/204b457e-3787-45ce-9e60-49e1751edaad-build-ca-bundles\") pod \"sg-core-1-build\" (UID: \"204b457e-3787-45ce-9e60-49e1751edaad\") " pod="service-telemetry/sg-core-1-build" Oct 03 00:28:15 crc kubenswrapper[4832]: I1003 00:28:15.648569 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/204b457e-3787-45ce-9e60-49e1751edaad-container-storage-root\") pod \"sg-core-1-build\" (UID: \"204b457e-3787-45ce-9e60-49e1751edaad\") " pod="service-telemetry/sg-core-1-build" Oct 03 00:28:15 crc kubenswrapper[4832]: I1003 00:28:15.652144 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-bl95s-pull\" (UniqueName: \"kubernetes.io/secret/204b457e-3787-45ce-9e60-49e1751edaad-builder-dockercfg-bl95s-pull\") pod \"sg-core-1-build\" (UID: \"204b457e-3787-45ce-9e60-49e1751edaad\") " pod="service-telemetry/sg-core-1-build" Oct 03 00:28:15 crc kubenswrapper[4832]: I1003 00:28:15.657092 4832 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"builder-dockercfg-bl95s-push\" (UniqueName: \"kubernetes.io/secret/204b457e-3787-45ce-9e60-49e1751edaad-builder-dockercfg-bl95s-push\") pod \"sg-core-1-build\" (UID: \"204b457e-3787-45ce-9e60-49e1751edaad\") " pod="service-telemetry/sg-core-1-build" Oct 03 00:28:15 crc kubenswrapper[4832]: I1003 00:28:15.664828 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxh47\" (UniqueName: \"kubernetes.io/projected/204b457e-3787-45ce-9e60-49e1751edaad-kube-api-access-pxh47\") pod \"sg-core-1-build\" (UID: \"204b457e-3787-45ce-9e60-49e1751edaad\") " pod="service-telemetry/sg-core-1-build" Oct 03 00:28:15 crc kubenswrapper[4832]: I1003 00:28:15.685549 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-core-1-build" Oct 03 00:28:15 crc kubenswrapper[4832]: I1003 00:28:15.871426 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-core-1-build"] Oct 03 00:28:16 crc kubenswrapper[4832]: I1003 00:28:16.186817 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-1-build" event={"ID":"204b457e-3787-45ce-9e60-49e1751edaad","Type":"ContainerStarted","Data":"8ef9c433781364d928199fd75d8f23bd218b1cef4ef7436f10f8694c540a339d"} Oct 03 00:28:17 crc kubenswrapper[4832]: I1003 00:28:17.193099 4832 generic.go:334] "Generic (PLEG): container finished" podID="204b457e-3787-45ce-9e60-49e1751edaad" containerID="f6a22311098a0c425e22b83c8f1866cb6c081884d7d6a437433812e18d09c977" exitCode=0 Oct 03 00:28:17 crc kubenswrapper[4832]: I1003 00:28:17.193184 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-1-build" event={"ID":"204b457e-3787-45ce-9e60-49e1751edaad","Type":"ContainerDied","Data":"f6a22311098a0c425e22b83c8f1866cb6c081884d7d6a437433812e18d09c977"} Oct 03 00:28:18 crc kubenswrapper[4832]: I1003 00:28:18.201544 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-1-build" event={"ID":"204b457e-3787-45ce-9e60-49e1751edaad","Type":"ContainerStarted","Data":"fd7ca9bdaa98c5662071c3d6b25627c889ee86b625a110e8448c6ac9ff34879e"} Oct 03 00:28:18 crc kubenswrapper[4832]: I1003 00:28:18.223844 4832 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/sg-core-1-build" podStartSLOduration=3.223824327 podStartE2EDuration="3.223824327s" podCreationTimestamp="2025-10-03 00:28:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 00:28:18.221919577 +0000 UTC m=+1291.484074683" watchObservedRunningTime="2025-10-03 00:28:18.223824327 +0000 UTC m=+1291.485979433" Oct 03 00:28:25 crc kubenswrapper[4832]: I1003 00:28:25.669958 4832 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/sg-core-1-build"] Oct 03 00:28:25 crc kubenswrapper[4832]: I1003 00:28:25.670711 4832 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/sg-core-1-build" podUID="204b457e-3787-45ce-9e60-49e1751edaad" containerName="docker-build" containerID="cri-o://fd7ca9bdaa98c5662071c3d6b25627c889ee86b625a110e8448c6ac9ff34879e" gracePeriod=30 Oct 03 00:28:26 crc kubenswrapper[4832]: I1003 00:28:26.055268 4832 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_sg-core-1-build_204b457e-3787-45ce-9e60-49e1751edaad/docker-build/0.log" Oct 03 00:28:26 crc kubenswrapper[4832]: I1003 00:28:26.055789 4832 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-core-1-build" Oct 03 00:28:26 crc kubenswrapper[4832]: I1003 00:28:26.181875 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/204b457e-3787-45ce-9e60-49e1751edaad-buildworkdir\") pod \"204b457e-3787-45ce-9e60-49e1751edaad\" (UID: \"204b457e-3787-45ce-9e60-49e1751edaad\") " Oct 03 00:28:26 crc kubenswrapper[4832]: I1003 00:28:26.181967 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/204b457e-3787-45ce-9e60-49e1751edaad-container-storage-run\") pod \"204b457e-3787-45ce-9e60-49e1751edaad\" (UID: \"204b457e-3787-45ce-9e60-49e1751edaad\") " Oct 03 00:28:26 crc kubenswrapper[4832]: I1003 00:28:26.181994 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/204b457e-3787-45ce-9e60-49e1751edaad-build-proxy-ca-bundles\") pod \"204b457e-3787-45ce-9e60-49e1751edaad\" (UID: \"204b457e-3787-45ce-9e60-49e1751edaad\") " Oct 03 00:28:26 crc kubenswrapper[4832]: I1003 00:28:26.182459 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/204b457e-3787-45ce-9e60-49e1751edaad-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "204b457e-3787-45ce-9e60-49e1751edaad" (UID: "204b457e-3787-45ce-9e60-49e1751edaad"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 00:28:26 crc kubenswrapper[4832]: I1003 00:28:26.182969 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/204b457e-3787-45ce-9e60-49e1751edaad-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "204b457e-3787-45ce-9e60-49e1751edaad" (UID: "204b457e-3787-45ce-9e60-49e1751edaad"). InnerVolumeSpecName "build-proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 00:28:26 crc kubenswrapper[4832]: I1003 00:28:26.183066 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/204b457e-3787-45ce-9e60-49e1751edaad-build-system-configs\") pod \"204b457e-3787-45ce-9e60-49e1751edaad\" (UID: \"204b457e-3787-45ce-9e60-49e1751edaad\") " Oct 03 00:28:26 crc kubenswrapper[4832]: I1003 00:28:26.183152 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/204b457e-3787-45ce-9e60-49e1751edaad-container-storage-root\") pod \"204b457e-3787-45ce-9e60-49e1751edaad\" (UID: \"204b457e-3787-45ce-9e60-49e1751edaad\") " Oct 03 00:28:26 crc kubenswrapper[4832]: I1003 00:28:26.183199 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-bl95s-push\" (UniqueName: \"kubernetes.io/secret/204b457e-3787-45ce-9e60-49e1751edaad-builder-dockercfg-bl95s-push\") pod \"204b457e-3787-45ce-9e60-49e1751edaad\" (UID: \"204b457e-3787-45ce-9e60-49e1751edaad\") " Oct 03 00:28:26 crc kubenswrapper[4832]: I1003 00:28:26.183083 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/204b457e-3787-45ce-9e60-49e1751edaad-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "204b457e-3787-45ce-9e60-49e1751edaad" (UID: "204b457e-3787-45ce-9e60-49e1751edaad"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 00:28:26 crc kubenswrapper[4832]: I1003 00:28:26.183271 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/204b457e-3787-45ce-9e60-49e1751edaad-buildcachedir\") pod \"204b457e-3787-45ce-9e60-49e1751edaad\" (UID: \"204b457e-3787-45ce-9e60-49e1751edaad\") " Oct 03 00:28:26 crc kubenswrapper[4832]: I1003 00:28:26.183310 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pxh47\" (UniqueName: \"kubernetes.io/projected/204b457e-3787-45ce-9e60-49e1751edaad-kube-api-access-pxh47\") pod \"204b457e-3787-45ce-9e60-49e1751edaad\" (UID: \"204b457e-3787-45ce-9e60-49e1751edaad\") " Oct 03 00:28:26 crc kubenswrapper[4832]: I1003 00:28:26.183329 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/204b457e-3787-45ce-9e60-49e1751edaad-node-pullsecrets\") pod \"204b457e-3787-45ce-9e60-49e1751edaad\" (UID: \"204b457e-3787-45ce-9e60-49e1751edaad\") " Oct 03 00:28:26 crc kubenswrapper[4832]: I1003 00:28:26.183344 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/204b457e-3787-45ce-9e60-49e1751edaad-build-ca-bundles\") pod \"204b457e-3787-45ce-9e60-49e1751edaad\" (UID: \"204b457e-3787-45ce-9e60-49e1751edaad\") " Oct 03 00:28:26 crc kubenswrapper[4832]: I1003 00:28:26.183382 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-bl95s-pull\" (UniqueName: \"kubernetes.io/secret/204b457e-3787-45ce-9e60-49e1751edaad-builder-dockercfg-bl95s-pull\") pod \"204b457e-3787-45ce-9e60-49e1751edaad\" (UID: \"204b457e-3787-45ce-9e60-49e1751edaad\") " Oct 03 00:28:26 crc kubenswrapper[4832]: I1003 00:28:26.183399 4832 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/204b457e-3787-45ce-9e60-49e1751edaad-build-blob-cache\") pod \"204b457e-3787-45ce-9e60-49e1751edaad\" (UID: \"204b457e-3787-45ce-9e60-49e1751edaad\") " Oct 03 00:28:26 crc kubenswrapper[4832]: I1003 00:28:26.183436 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/204b457e-3787-45ce-9e60-49e1751edaad-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "204b457e-3787-45ce-9e60-49e1751edaad" (UID: "204b457e-3787-45ce-9e60-49e1751edaad"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 00:28:26 crc kubenswrapper[4832]: I1003 00:28:26.183619 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/204b457e-3787-45ce-9e60-49e1751edaad-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "204b457e-3787-45ce-9e60-49e1751edaad" (UID: "204b457e-3787-45ce-9e60-49e1751edaad"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 00:28:26 crc kubenswrapper[4832]: I1003 00:28:26.183801 4832 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/204b457e-3787-45ce-9e60-49e1751edaad-buildworkdir\") on node \"crc\" DevicePath \"\"" Oct 03 00:28:26 crc kubenswrapper[4832]: I1003 00:28:26.183833 4832 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/204b457e-3787-45ce-9e60-49e1751edaad-container-storage-run\") on node \"crc\" DevicePath \"\"" Oct 03 00:28:26 crc kubenswrapper[4832]: I1003 00:28:26.183854 4832 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/204b457e-3787-45ce-9e60-49e1751edaad-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 03 00:28:26 crc kubenswrapper[4832]: I1003 00:28:26.183868 4832 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/204b457e-3787-45ce-9e60-49e1751edaad-build-system-configs\") on node \"crc\" DevicePath \"\"" Oct 03 00:28:26 crc kubenswrapper[4832]: I1003 00:28:26.183877 4832 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/204b457e-3787-45ce-9e60-49e1751edaad-buildcachedir\") on node \"crc\" DevicePath \"\"" Oct 03 00:28:26 crc kubenswrapper[4832]: I1003 00:28:26.183995 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/204b457e-3787-45ce-9e60-49e1751edaad-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "204b457e-3787-45ce-9e60-49e1751edaad" (UID: "204b457e-3787-45ce-9e60-49e1751edaad"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 00:28:26 crc kubenswrapper[4832]: I1003 00:28:26.184395 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/204b457e-3787-45ce-9e60-49e1751edaad-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "204b457e-3787-45ce-9e60-49e1751edaad" (UID: "204b457e-3787-45ce-9e60-49e1751edaad"). InnerVolumeSpecName "build-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 00:28:26 crc kubenswrapper[4832]: I1003 00:28:26.188185 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/204b457e-3787-45ce-9e60-49e1751edaad-builder-dockercfg-bl95s-pull" (OuterVolumeSpecName: "builder-dockercfg-bl95s-pull") pod "204b457e-3787-45ce-9e60-49e1751edaad" (UID: "204b457e-3787-45ce-9e60-49e1751edaad"). InnerVolumeSpecName "builder-dockercfg-bl95s-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 00:28:26 crc kubenswrapper[4832]: I1003 00:28:26.188306 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/204b457e-3787-45ce-9e60-49e1751edaad-kube-api-access-pxh47" (OuterVolumeSpecName: "kube-api-access-pxh47") pod "204b457e-3787-45ce-9e60-49e1751edaad" (UID: "204b457e-3787-45ce-9e60-49e1751edaad"). InnerVolumeSpecName "kube-api-access-pxh47". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 00:28:26 crc kubenswrapper[4832]: I1003 00:28:26.188745 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/204b457e-3787-45ce-9e60-49e1751edaad-builder-dockercfg-bl95s-push" (OuterVolumeSpecName: "builder-dockercfg-bl95s-push") pod "204b457e-3787-45ce-9e60-49e1751edaad" (UID: "204b457e-3787-45ce-9e60-49e1751edaad"). InnerVolumeSpecName "builder-dockercfg-bl95s-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 00:28:26 crc kubenswrapper[4832]: I1003 00:28:26.252895 4832 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_sg-core-1-build_204b457e-3787-45ce-9e60-49e1751edaad/docker-build/0.log" Oct 03 00:28:26 crc kubenswrapper[4832]: I1003 00:28:26.260098 4832 generic.go:334] "Generic (PLEG): container finished" podID="204b457e-3787-45ce-9e60-49e1751edaad" containerID="fd7ca9bdaa98c5662071c3d6b25627c889ee86b625a110e8448c6ac9ff34879e" exitCode=1 Oct 03 00:28:26 crc kubenswrapper[4832]: I1003 00:28:26.260169 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-1-build" event={"ID":"204b457e-3787-45ce-9e60-49e1751edaad","Type":"ContainerDied","Data":"fd7ca9bdaa98c5662071c3d6b25627c889ee86b625a110e8448c6ac9ff34879e"} Oct 03 00:28:26 crc kubenswrapper[4832]: I1003 00:28:26.260198 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-1-build" event={"ID":"204b457e-3787-45ce-9e60-49e1751edaad","Type":"ContainerDied","Data":"8ef9c433781364d928199fd75d8f23bd218b1cef4ef7436f10f8694c540a339d"} Oct 03 00:28:26 crc kubenswrapper[4832]: I1003 00:28:26.260219 4832 scope.go:117] "RemoveContainer" containerID="fd7ca9bdaa98c5662071c3d6b25627c889ee86b625a110e8448c6ac9ff34879e" Oct 03 00:28:26 crc kubenswrapper[4832]: I1003 00:28:26.260221 4832 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-core-1-build" Oct 03 00:28:26 crc kubenswrapper[4832]: I1003 00:28:26.284182 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/204b457e-3787-45ce-9e60-49e1751edaad-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "204b457e-3787-45ce-9e60-49e1751edaad" (UID: "204b457e-3787-45ce-9e60-49e1751edaad"). InnerVolumeSpecName "build-blob-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 00:28:26 crc kubenswrapper[4832]: I1003 00:28:26.285325 4832 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-bl95s-push\" (UniqueName: \"kubernetes.io/secret/204b457e-3787-45ce-9e60-49e1751edaad-builder-dockercfg-bl95s-push\") on node \"crc\" DevicePath \"\"" Oct 03 00:28:26 crc kubenswrapper[4832]: I1003 00:28:26.285353 4832 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pxh47\" (UniqueName: \"kubernetes.io/projected/204b457e-3787-45ce-9e60-49e1751edaad-kube-api-access-pxh47\") on node \"crc\" DevicePath \"\"" Oct 03 00:28:26 crc kubenswrapper[4832]: I1003 00:28:26.285366 4832 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/204b457e-3787-45ce-9e60-49e1751edaad-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Oct 03 00:28:26 crc kubenswrapper[4832]: I1003 00:28:26.285378 4832 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/204b457e-3787-45ce-9e60-49e1751edaad-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 03 00:28:26 crc kubenswrapper[4832]: I1003 00:28:26.285390 4832 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-bl95s-pull\" (UniqueName: \"kubernetes.io/secret/204b457e-3787-45ce-9e60-49e1751edaad-builder-dockercfg-bl95s-pull\") on node \"crc\" DevicePath \"\"" Oct 03 00:28:26 crc kubenswrapper[4832]: I1003 00:28:26.285401 4832 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/204b457e-3787-45ce-9e60-49e1751edaad-build-blob-cache\") on node \"crc\" DevicePath \"\"" Oct 03 00:28:26 crc kubenswrapper[4832]: I1003 00:28:26.299890 4832 scope.go:117] "RemoveContainer" containerID="f6a22311098a0c425e22b83c8f1866cb6c081884d7d6a437433812e18d09c977" Oct 03 00:28:26 crc kubenswrapper[4832]: I1003 00:28:26.318157 4832 scope.go:117] "RemoveContainer" containerID="fd7ca9bdaa98c5662071c3d6b25627c889ee86b625a110e8448c6ac9ff34879e" Oct 03 00:28:26 crc kubenswrapper[4832]: E1003 00:28:26.319041 4832 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd7ca9bdaa98c5662071c3d6b25627c889ee86b625a110e8448c6ac9ff34879e\": container with ID starting with fd7ca9bdaa98c5662071c3d6b25627c889ee86b625a110e8448c6ac9ff34879e not found: ID does not exist" containerID="fd7ca9bdaa98c5662071c3d6b25627c889ee86b625a110e8448c6ac9ff34879e" Oct 03 00:28:26 crc kubenswrapper[4832]: I1003 00:28:26.319094 4832 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd7ca9bdaa98c5662071c3d6b25627c889ee86b625a110e8448c6ac9ff34879e"} err="failed to get container status \"fd7ca9bdaa98c5662071c3d6b25627c889ee86b625a110e8448c6ac9ff34879e\": rpc error: code = NotFound desc = could not find container \"fd7ca9bdaa98c5662071c3d6b25627c889ee86b625a110e8448c6ac9ff34879e\": container with ID starting with fd7ca9bdaa98c5662071c3d6b25627c889ee86b625a110e8448c6ac9ff34879e not found: ID does not exist" Oct 03 00:28:26 crc kubenswrapper[4832]: I1003 00:28:26.319125 4832 scope.go:117] "RemoveContainer" containerID="f6a22311098a0c425e22b83c8f1866cb6c081884d7d6a437433812e18d09c977" Oct 03 00:28:26 crc kubenswrapper[4832]: E1003 00:28:26.319419 4832 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"f6a22311098a0c425e22b83c8f1866cb6c081884d7d6a437433812e18d09c977\": container with ID starting with f6a22311098a0c425e22b83c8f1866cb6c081884d7d6a437433812e18d09c977 not found: ID does not exist" containerID="f6a22311098a0c425e22b83c8f1866cb6c081884d7d6a437433812e18d09c977" Oct 03 00:28:26 crc kubenswrapper[4832]: I1003 00:28:26.319461 4832 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6a22311098a0c425e22b83c8f1866cb6c081884d7d6a437433812e18d09c977"} err="failed to get container status \"f6a22311098a0c425e22b83c8f1866cb6c081884d7d6a437433812e18d09c977\": rpc error: code = NotFound desc = could not find container \"f6a22311098a0c425e22b83c8f1866cb6c081884d7d6a437433812e18d09c977\": container with ID starting with f6a22311098a0c425e22b83c8f1866cb6c081884d7d6a437433812e18d09c977 not found: ID does not exist" Oct 03 00:28:26 crc kubenswrapper[4832]: I1003 00:28:26.397274 4832 patch_prober.go:28] interesting pod/machine-config-daemon-xzpk4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 00:28:26 crc kubenswrapper[4832]: I1003 00:28:26.397528 4832 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" podUID="13eeb165-19f7-4e3f-a442-1c9c570d853b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 00:28:26 crc kubenswrapper[4832]: I1003 00:28:26.397671 4832 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" Oct 03 00:28:26 crc kubenswrapper[4832]: I1003 00:28:26.398394 4832 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"369e795d8a4b85988f84323e9f3fc20c7b786482341179ac65c2354619c74c30"} pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 00:28:26 crc kubenswrapper[4832]: I1003 00:28:26.398547 4832 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" podUID="13eeb165-19f7-4e3f-a442-1c9c570d853b" containerName="machine-config-daemon" containerID="cri-o://369e795d8a4b85988f84323e9f3fc20c7b786482341179ac65c2354619c74c30" gracePeriod=600 Oct 03 00:28:26 crc kubenswrapper[4832]: I1003 00:28:26.588324 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/204b457e-3787-45ce-9e60-49e1751edaad-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "204b457e-3787-45ce-9e60-49e1751edaad" (UID: "204b457e-3787-45ce-9e60-49e1751edaad"). InnerVolumeSpecName "container-storage-root". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 00:28:26 crc kubenswrapper[4832]: I1003 00:28:26.689591 4832 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/204b457e-3787-45ce-9e60-49e1751edaad-container-storage-root\") on node \"crc\" DevicePath \"\"" Oct 03 00:28:26 crc kubenswrapper[4832]: I1003 00:28:26.886806 4832 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/sg-core-1-build"] Oct 03 00:28:26 crc kubenswrapper[4832]: I1003 00:28:26.893784 4832 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/sg-core-1-build"] Oct 03 00:28:27 crc kubenswrapper[4832]: I1003 00:28:27.244357 4832 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/sg-core-2-build"] Oct 03 00:28:27 crc kubenswrapper[4832]: E1003 00:28:27.244916 4832 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="204b457e-3787-45ce-9e60-49e1751edaad" containerName="docker-build" Oct 03 00:28:27 crc kubenswrapper[4832]: I1003 00:28:27.244937 4832 state_mem.go:107] "Deleted CPUSet assignment" podUID="204b457e-3787-45ce-9e60-49e1751edaad" containerName="docker-build" Oct 03 00:28:27 crc kubenswrapper[4832]: E1003 00:28:27.244955 4832 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="204b457e-3787-45ce-9e60-49e1751edaad" containerName="manage-dockerfile" Oct 03 00:28:27 crc kubenswrapper[4832]: I1003 00:28:27.244963 4832 state_mem.go:107] "Deleted CPUSet assignment" podUID="204b457e-3787-45ce-9e60-49e1751edaad" containerName="manage-dockerfile" Oct 03 00:28:27 crc kubenswrapper[4832]: I1003 00:28:27.245109 4832 memory_manager.go:354] "RemoveStaleState removing state" podUID="204b457e-3787-45ce-9e60-49e1751edaad" containerName="docker-build" Oct 03 00:28:27 crc kubenswrapper[4832]: I1003 00:28:27.246096 4832 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-core-2-build" Oct 03 00:28:27 crc kubenswrapper[4832]: I1003 00:28:27.247809 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-core-2-sys-config" Oct 03 00:28:27 crc kubenswrapper[4832]: I1003 00:28:27.248148 4832 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-bl95s" Oct 03 00:28:27 crc kubenswrapper[4832]: I1003 00:28:27.248478 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-core-2-global-ca" Oct 03 00:28:27 crc kubenswrapper[4832]: I1003 00:28:27.248436 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-core-2-ca" Oct 03 00:28:27 crc kubenswrapper[4832]: I1003 00:28:27.266398 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-core-2-build"] Oct 03 00:28:27 crc kubenswrapper[4832]: I1003 00:28:27.269357 4832 generic.go:334] "Generic (PLEG): container finished" podID="13eeb165-19f7-4e3f-a442-1c9c570d853b" containerID="369e795d8a4b85988f84323e9f3fc20c7b786482341179ac65c2354619c74c30" exitCode=0 Oct 03 00:28:27 crc kubenswrapper[4832]: I1003 00:28:27.269459 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" event={"ID":"13eeb165-19f7-4e3f-a442-1c9c570d853b","Type":"ContainerDied","Data":"369e795d8a4b85988f84323e9f3fc20c7b786482341179ac65c2354619c74c30"} Oct 03 00:28:27 crc kubenswrapper[4832]: I1003 00:28:27.269692 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" event={"ID":"13eeb165-19f7-4e3f-a442-1c9c570d853b","Type":"ContainerStarted","Data":"e44b7de9ac0dc4f733d645c5d56e63447d6c890efc8c243af73d5c7937b609ee"} Oct 03 00:28:27 crc kubenswrapper[4832]: I1003 00:28:27.269785 4832 scope.go:117] "RemoveContainer" containerID="8e0c5b5dfd83bacd8cdf28933197b737cfcc5a83c84debe8eeb40692c88e27e0" Oct 03 00:28:27 crc kubenswrapper[4832]: I1003 00:28:27.394055 4832 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="204b457e-3787-45ce-9e60-49e1751edaad" path="/var/lib/kubelet/pods/204b457e-3787-45ce-9e60-49e1751edaad/volumes" Oct 03 00:28:27 crc kubenswrapper[4832]: I1003 00:28:27.398593 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/647b857f-9035-4675-89e1-29a2f26cd19d-build-blob-cache\") pod \"sg-core-2-build\" (UID: \"647b857f-9035-4675-89e1-29a2f26cd19d\") " pod="service-telemetry/sg-core-2-build" Oct 03 00:28:27 crc kubenswrapper[4832]: I1003 00:28:27.398632 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zq4zm\" (UniqueName: \"kubernetes.io/projected/647b857f-9035-4675-89e1-29a2f26cd19d-kube-api-access-zq4zm\") pod \"sg-core-2-build\" (UID: \"647b857f-9035-4675-89e1-29a2f26cd19d\") " pod="service-telemetry/sg-core-2-build" Oct 03 00:28:27 crc kubenswrapper[4832]: I1003 00:28:27.398653 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/647b857f-9035-4675-89e1-29a2f26cd19d-container-storage-run\") pod \"sg-core-2-build\" (UID: \"647b857f-9035-4675-89e1-29a2f26cd19d\") " pod="service-telemetry/sg-core-2-build" Oct 03 00:28:27 crc kubenswrapper[4832]: I1003 00:28:27.398679 4832 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/647b857f-9035-4675-89e1-29a2f26cd19d-buildworkdir\") pod \"sg-core-2-build\" (UID: \"647b857f-9035-4675-89e1-29a2f26cd19d\") " pod="service-telemetry/sg-core-2-build" Oct 03 00:28:27 crc kubenswrapper[4832]: I1003 00:28:27.398813 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/647b857f-9035-4675-89e1-29a2f26cd19d-build-ca-bundles\") pod \"sg-core-2-build\" (UID: \"647b857f-9035-4675-89e1-29a2f26cd19d\") " pod="service-telemetry/sg-core-2-build" Oct 03 00:28:27 crc kubenswrapper[4832]: I1003 00:28:27.398835 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/647b857f-9035-4675-89e1-29a2f26cd19d-build-proxy-ca-bundles\") pod \"sg-core-2-build\" (UID: \"647b857f-9035-4675-89e1-29a2f26cd19d\") " pod="service-telemetry/sg-core-2-build" Oct 03 00:28:27 crc kubenswrapper[4832]: I1003 00:28:27.398881 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/647b857f-9035-4675-89e1-29a2f26cd19d-buildcachedir\") pod \"sg-core-2-build\" (UID: \"647b857f-9035-4675-89e1-29a2f26cd19d\") " pod="service-telemetry/sg-core-2-build" Oct 03 00:28:27 crc kubenswrapper[4832]: I1003 00:28:27.398903 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/647b857f-9035-4675-89e1-29a2f26cd19d-build-system-configs\") pod \"sg-core-2-build\" (UID: \"647b857f-9035-4675-89e1-29a2f26cd19d\") " pod="service-telemetry/sg-core-2-build" Oct 03 00:28:27 crc kubenswrapper[4832]: I1003 00:28:27.398928 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/647b857f-9035-4675-89e1-29a2f26cd19d-node-pullsecrets\") pod \"sg-core-2-build\" (UID: \"647b857f-9035-4675-89e1-29a2f26cd19d\") " pod="service-telemetry/sg-core-2-build" Oct 03 00:28:27 crc kubenswrapper[4832]: I1003 00:28:27.398960 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/647b857f-9035-4675-89e1-29a2f26cd19d-container-storage-root\") pod \"sg-core-2-build\" (UID: \"647b857f-9035-4675-89e1-29a2f26cd19d\") " pod="service-telemetry/sg-core-2-build" Oct 03 00:28:27 crc kubenswrapper[4832]: I1003 00:28:27.398989 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-bl95s-pull\" (UniqueName: \"kubernetes.io/secret/647b857f-9035-4675-89e1-29a2f26cd19d-builder-dockercfg-bl95s-pull\") pod \"sg-core-2-build\" (UID: \"647b857f-9035-4675-89e1-29a2f26cd19d\") " pod="service-telemetry/sg-core-2-build" Oct 03 00:28:27 crc kubenswrapper[4832]: I1003 00:28:27.399030 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-bl95s-push\" (UniqueName: \"kubernetes.io/secret/647b857f-9035-4675-89e1-29a2f26cd19d-builder-dockercfg-bl95s-push\") pod \"sg-core-2-build\" (UID: \"647b857f-9035-4675-89e1-29a2f26cd19d\") " pod="service-telemetry/sg-core-2-build" Oct 03 00:28:27 
crc kubenswrapper[4832]: I1003 00:28:27.499794 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/647b857f-9035-4675-89e1-29a2f26cd19d-build-ca-bundles\") pod \"sg-core-2-build\" (UID: \"647b857f-9035-4675-89e1-29a2f26cd19d\") " pod="service-telemetry/sg-core-2-build" Oct 03 00:28:27 crc kubenswrapper[4832]: I1003 00:28:27.499842 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/647b857f-9035-4675-89e1-29a2f26cd19d-build-proxy-ca-bundles\") pod \"sg-core-2-build\" (UID: \"647b857f-9035-4675-89e1-29a2f26cd19d\") " pod="service-telemetry/sg-core-2-build" Oct 03 00:28:27 crc kubenswrapper[4832]: I1003 00:28:27.499936 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/647b857f-9035-4675-89e1-29a2f26cd19d-buildcachedir\") pod \"sg-core-2-build\" (UID: \"647b857f-9035-4675-89e1-29a2f26cd19d\") " pod="service-telemetry/sg-core-2-build" Oct 03 00:28:27 crc kubenswrapper[4832]: I1003 00:28:27.499952 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/647b857f-9035-4675-89e1-29a2f26cd19d-build-system-configs\") pod \"sg-core-2-build\" (UID: \"647b857f-9035-4675-89e1-29a2f26cd19d\") " pod="service-telemetry/sg-core-2-build" Oct 03 00:28:27 crc kubenswrapper[4832]: I1003 00:28:27.499974 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/647b857f-9035-4675-89e1-29a2f26cd19d-node-pullsecrets\") pod \"sg-core-2-build\" (UID: \"647b857f-9035-4675-89e1-29a2f26cd19d\") " pod="service-telemetry/sg-core-2-build" Oct 03 00:28:27 crc kubenswrapper[4832]: I1003 00:28:27.499997 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/647b857f-9035-4675-89e1-29a2f26cd19d-container-storage-root\") pod \"sg-core-2-build\" (UID: \"647b857f-9035-4675-89e1-29a2f26cd19d\") " pod="service-telemetry/sg-core-2-build" Oct 03 00:28:27 crc kubenswrapper[4832]: I1003 00:28:27.500013 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-bl95s-pull\" (UniqueName: \"kubernetes.io/secret/647b857f-9035-4675-89e1-29a2f26cd19d-builder-dockercfg-bl95s-pull\") pod \"sg-core-2-build\" (UID: \"647b857f-9035-4675-89e1-29a2f26cd19d\") " pod="service-telemetry/sg-core-2-build" Oct 03 00:28:27 crc kubenswrapper[4832]: I1003 00:28:27.500037 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-bl95s-push\" (UniqueName: \"kubernetes.io/secret/647b857f-9035-4675-89e1-29a2f26cd19d-builder-dockercfg-bl95s-push\") pod \"sg-core-2-build\" (UID: \"647b857f-9035-4675-89e1-29a2f26cd19d\") " pod="service-telemetry/sg-core-2-build" Oct 03 00:28:27 crc kubenswrapper[4832]: I1003 00:28:27.500083 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/647b857f-9035-4675-89e1-29a2f26cd19d-build-blob-cache\") pod \"sg-core-2-build\" (UID: \"647b857f-9035-4675-89e1-29a2f26cd19d\") " pod="service-telemetry/sg-core-2-build" Oct 03 00:28:27 crc kubenswrapper[4832]: I1003 00:28:27.500101 4832 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-zq4zm\" (UniqueName: \"kubernetes.io/projected/647b857f-9035-4675-89e1-29a2f26cd19d-kube-api-access-zq4zm\") pod \"sg-core-2-build\" (UID: \"647b857f-9035-4675-89e1-29a2f26cd19d\") " pod="service-telemetry/sg-core-2-build" Oct 03 00:28:27 crc kubenswrapper[4832]: I1003 00:28:27.500121 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/647b857f-9035-4675-89e1-29a2f26cd19d-container-storage-run\") pod \"sg-core-2-build\" (UID: \"647b857f-9035-4675-89e1-29a2f26cd19d\") " pod="service-telemetry/sg-core-2-build" Oct 03 00:28:27 crc kubenswrapper[4832]: I1003 00:28:27.500142 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/647b857f-9035-4675-89e1-29a2f26cd19d-buildworkdir\") pod \"sg-core-2-build\" (UID: \"647b857f-9035-4675-89e1-29a2f26cd19d\") " pod="service-telemetry/sg-core-2-build" Oct 03 00:28:27 crc kubenswrapper[4832]: I1003 00:28:27.500290 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/647b857f-9035-4675-89e1-29a2f26cd19d-node-pullsecrets\") pod \"sg-core-2-build\" (UID: \"647b857f-9035-4675-89e1-29a2f26cd19d\") " pod="service-telemetry/sg-core-2-build" Oct 03 00:28:27 crc kubenswrapper[4832]: I1003 00:28:27.500335 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/647b857f-9035-4675-89e1-29a2f26cd19d-buildcachedir\") pod \"sg-core-2-build\" (UID: \"647b857f-9035-4675-89e1-29a2f26cd19d\") " pod="service-telemetry/sg-core-2-build" Oct 03 00:28:27 crc kubenswrapper[4832]: I1003 00:28:27.500480 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/647b857f-9035-4675-89e1-29a2f26cd19d-container-storage-root\") pod \"sg-core-2-build\" (UID: \"647b857f-9035-4675-89e1-29a2f26cd19d\") " pod="service-telemetry/sg-core-2-build" Oct 03 00:28:27 crc kubenswrapper[4832]: I1003 00:28:27.500505 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/647b857f-9035-4675-89e1-29a2f26cd19d-buildworkdir\") pod \"sg-core-2-build\" (UID: \"647b857f-9035-4675-89e1-29a2f26cd19d\") " pod="service-telemetry/sg-core-2-build" Oct 03 00:28:27 crc kubenswrapper[4832]: I1003 00:28:27.500645 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/647b857f-9035-4675-89e1-29a2f26cd19d-container-storage-run\") pod \"sg-core-2-build\" (UID: \"647b857f-9035-4675-89e1-29a2f26cd19d\") " pod="service-telemetry/sg-core-2-build" Oct 03 00:28:27 crc kubenswrapper[4832]: I1003 00:28:27.500791 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/647b857f-9035-4675-89e1-29a2f26cd19d-build-blob-cache\") pod \"sg-core-2-build\" (UID: \"647b857f-9035-4675-89e1-29a2f26cd19d\") " pod="service-telemetry/sg-core-2-build" Oct 03 00:28:27 crc kubenswrapper[4832]: I1003 00:28:27.501921 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/647b857f-9035-4675-89e1-29a2f26cd19d-build-proxy-ca-bundles\") pod \"sg-core-2-build\" (UID: 
\"647b857f-9035-4675-89e1-29a2f26cd19d\") " pod="service-telemetry/sg-core-2-build" Oct 03 00:28:27 crc kubenswrapper[4832]: I1003 00:28:27.502258 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/647b857f-9035-4675-89e1-29a2f26cd19d-build-ca-bundles\") pod \"sg-core-2-build\" (UID: \"647b857f-9035-4675-89e1-29a2f26cd19d\") " pod="service-telemetry/sg-core-2-build" Oct 03 00:28:27 crc kubenswrapper[4832]: I1003 00:28:27.502262 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/647b857f-9035-4675-89e1-29a2f26cd19d-build-system-configs\") pod \"sg-core-2-build\" (UID: \"647b857f-9035-4675-89e1-29a2f26cd19d\") " pod="service-telemetry/sg-core-2-build" Oct 03 00:28:27 crc kubenswrapper[4832]: I1003 00:28:27.506263 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-bl95s-pull\" (UniqueName: \"kubernetes.io/secret/647b857f-9035-4675-89e1-29a2f26cd19d-builder-dockercfg-bl95s-pull\") pod \"sg-core-2-build\" (UID: \"647b857f-9035-4675-89e1-29a2f26cd19d\") " pod="service-telemetry/sg-core-2-build" Oct 03 00:28:27 crc kubenswrapper[4832]: I1003 00:28:27.506625 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-bl95s-push\" (UniqueName: \"kubernetes.io/secret/647b857f-9035-4675-89e1-29a2f26cd19d-builder-dockercfg-bl95s-push\") pod \"sg-core-2-build\" (UID: \"647b857f-9035-4675-89e1-29a2f26cd19d\") " pod="service-telemetry/sg-core-2-build" Oct 03 00:28:27 crc kubenswrapper[4832]: I1003 00:28:27.516261 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zq4zm\" (UniqueName: \"kubernetes.io/projected/647b857f-9035-4675-89e1-29a2f26cd19d-kube-api-access-zq4zm\") pod \"sg-core-2-build\" (UID: \"647b857f-9035-4675-89e1-29a2f26cd19d\") " pod="service-telemetry/sg-core-2-build" Oct 03 00:28:27 crc kubenswrapper[4832]: I1003 00:28:27.562411 4832 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-core-2-build" Oct 03 00:28:27 crc kubenswrapper[4832]: I1003 00:28:27.756116 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-core-2-build"] Oct 03 00:28:27 crc kubenswrapper[4832]: W1003 00:28:27.764073 4832 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod647b857f_9035_4675_89e1_29a2f26cd19d.slice/crio-7cac01ea47885a2143e4f81e8cee5d72de6bd842d9a38fece37d7041914529c0 WatchSource:0}: Error finding container 7cac01ea47885a2143e4f81e8cee5d72de6bd842d9a38fece37d7041914529c0: Status 404 returned error can't find the container with id 7cac01ea47885a2143e4f81e8cee5d72de6bd842d9a38fece37d7041914529c0 Oct 03 00:28:28 crc kubenswrapper[4832]: I1003 00:28:28.278098 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-2-build" event={"ID":"647b857f-9035-4675-89e1-29a2f26cd19d","Type":"ContainerStarted","Data":"7dadb3ec3614200229497e6f792fe2ea442e91bef507bac748220ebdb488d086"} Oct 03 00:28:28 crc kubenswrapper[4832]: I1003 00:28:28.278391 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-2-build" event={"ID":"647b857f-9035-4675-89e1-29a2f26cd19d","Type":"ContainerStarted","Data":"7cac01ea47885a2143e4f81e8cee5d72de6bd842d9a38fece37d7041914529c0"} Oct 03 00:28:29 crc kubenswrapper[4832]: I1003 00:28:29.284585 4832 generic.go:334] "Generic (PLEG): container finished" podID="647b857f-9035-4675-89e1-29a2f26cd19d" containerID="7dadb3ec3614200229497e6f792fe2ea442e91bef507bac748220ebdb488d086" exitCode=0 Oct 03 00:28:29 crc kubenswrapper[4832]: I1003 00:28:29.284624 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-2-build" event={"ID":"647b857f-9035-4675-89e1-29a2f26cd19d","Type":"ContainerDied","Data":"7dadb3ec3614200229497e6f792fe2ea442e91bef507bac748220ebdb488d086"} Oct 03 00:28:30 crc kubenswrapper[4832]: I1003 00:28:30.293531 4832 generic.go:334] "Generic (PLEG): container finished" podID="647b857f-9035-4675-89e1-29a2f26cd19d" containerID="373f7d9c1cb764ca5a4bec82dd8a6270cc562fdb58d421683c0fb6d914daa535" exitCode=0 Oct 03 00:28:30 crc kubenswrapper[4832]: I1003 00:28:30.293579 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-2-build" event={"ID":"647b857f-9035-4675-89e1-29a2f26cd19d","Type":"ContainerDied","Data":"373f7d9c1cb764ca5a4bec82dd8a6270cc562fdb58d421683c0fb6d914daa535"} Oct 03 00:28:30 crc kubenswrapper[4832]: I1003 00:28:30.354281 4832 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_sg-core-2-build_647b857f-9035-4675-89e1-29a2f26cd19d/manage-dockerfile/0.log" Oct 03 00:28:31 crc kubenswrapper[4832]: I1003 00:28:31.302240 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-2-build" event={"ID":"647b857f-9035-4675-89e1-29a2f26cd19d","Type":"ContainerStarted","Data":"2e8296d563b797ef878168136476016ce4df1c7f28445a1a87f02f8d71d88e1b"} Oct 03 00:28:31 crc kubenswrapper[4832]: I1003 00:28:31.330359 4832 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/sg-core-2-build" podStartSLOduration=4.330341072 podStartE2EDuration="4.330341072s" podCreationTimestamp="2025-10-03 00:28:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 00:28:31.328767712 +0000 UTC m=+1304.590922838" 
watchObservedRunningTime="2025-10-03 00:28:31.330341072 +0000 UTC m=+1304.592496188" Oct 03 00:30:00 crc kubenswrapper[4832]: I1003 00:30:00.139677 4832 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324190-7hth9"] Oct 03 00:30:00 crc kubenswrapper[4832]: I1003 00:30:00.142305 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324190-7hth9" Oct 03 00:30:00 crc kubenswrapper[4832]: I1003 00:30:00.144238 4832 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 03 00:30:00 crc kubenswrapper[4832]: I1003 00:30:00.146307 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324190-7hth9"] Oct 03 00:30:00 crc kubenswrapper[4832]: I1003 00:30:00.146732 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 03 00:30:00 crc kubenswrapper[4832]: I1003 00:30:00.309574 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1252e050-f2e5-4c87-8982-e8c4a43ec707-secret-volume\") pod \"collect-profiles-29324190-7hth9\" (UID: \"1252e050-f2e5-4c87-8982-e8c4a43ec707\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324190-7hth9" Oct 03 00:30:00 crc kubenswrapper[4832]: I1003 00:30:00.309629 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1252e050-f2e5-4c87-8982-e8c4a43ec707-config-volume\") pod \"collect-profiles-29324190-7hth9\" (UID: \"1252e050-f2e5-4c87-8982-e8c4a43ec707\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324190-7hth9" Oct 03 00:30:00 crc kubenswrapper[4832]: I1003 00:30:00.309693 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-986z4\" (UniqueName: \"kubernetes.io/projected/1252e050-f2e5-4c87-8982-e8c4a43ec707-kube-api-access-986z4\") pod \"collect-profiles-29324190-7hth9\" (UID: \"1252e050-f2e5-4c87-8982-e8c4a43ec707\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324190-7hth9" Oct 03 00:30:00 crc kubenswrapper[4832]: I1003 00:30:00.411133 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1252e050-f2e5-4c87-8982-e8c4a43ec707-config-volume\") pod \"collect-profiles-29324190-7hth9\" (UID: \"1252e050-f2e5-4c87-8982-e8c4a43ec707\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324190-7hth9" Oct 03 00:30:00 crc kubenswrapper[4832]: I1003 00:30:00.411771 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-986z4\" (UniqueName: \"kubernetes.io/projected/1252e050-f2e5-4c87-8982-e8c4a43ec707-kube-api-access-986z4\") pod \"collect-profiles-29324190-7hth9\" (UID: \"1252e050-f2e5-4c87-8982-e8c4a43ec707\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324190-7hth9" Oct 03 00:30:00 crc kubenswrapper[4832]: I1003 00:30:00.411941 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1252e050-f2e5-4c87-8982-e8c4a43ec707-secret-volume\") pod 
\"collect-profiles-29324190-7hth9\" (UID: \"1252e050-f2e5-4c87-8982-e8c4a43ec707\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324190-7hth9" Oct 03 00:30:00 crc kubenswrapper[4832]: I1003 00:30:00.412268 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1252e050-f2e5-4c87-8982-e8c4a43ec707-config-volume\") pod \"collect-profiles-29324190-7hth9\" (UID: \"1252e050-f2e5-4c87-8982-e8c4a43ec707\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324190-7hth9" Oct 03 00:30:00 crc kubenswrapper[4832]: I1003 00:30:00.424681 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1252e050-f2e5-4c87-8982-e8c4a43ec707-secret-volume\") pod \"collect-profiles-29324190-7hth9\" (UID: \"1252e050-f2e5-4c87-8982-e8c4a43ec707\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324190-7hth9" Oct 03 00:30:00 crc kubenswrapper[4832]: I1003 00:30:00.433351 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-986z4\" (UniqueName: \"kubernetes.io/projected/1252e050-f2e5-4c87-8982-e8c4a43ec707-kube-api-access-986z4\") pod \"collect-profiles-29324190-7hth9\" (UID: \"1252e050-f2e5-4c87-8982-e8c4a43ec707\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324190-7hth9" Oct 03 00:30:00 crc kubenswrapper[4832]: I1003 00:30:00.460864 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324190-7hth9" Oct 03 00:30:00 crc kubenswrapper[4832]: I1003 00:30:00.864989 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324190-7hth9"] Oct 03 00:30:01 crc kubenswrapper[4832]: I1003 00:30:01.828173 4832 generic.go:334] "Generic (PLEG): container finished" podID="1252e050-f2e5-4c87-8982-e8c4a43ec707" containerID="7ab009795bfbfe77ca695d9ad79a8f14d04df52c90a843c9ae470a4e8bd0013a" exitCode=0 Oct 03 00:30:01 crc kubenswrapper[4832]: I1003 00:30:01.828215 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324190-7hth9" event={"ID":"1252e050-f2e5-4c87-8982-e8c4a43ec707","Type":"ContainerDied","Data":"7ab009795bfbfe77ca695d9ad79a8f14d04df52c90a843c9ae470a4e8bd0013a"} Oct 03 00:30:01 crc kubenswrapper[4832]: I1003 00:30:01.828240 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324190-7hth9" event={"ID":"1252e050-f2e5-4c87-8982-e8c4a43ec707","Type":"ContainerStarted","Data":"b2739edaca68adcc48700d86078b11c1657ab044c865d8d2bc9f1bae75fa1006"} Oct 03 00:30:03 crc kubenswrapper[4832]: I1003 00:30:03.040696 4832 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324190-7hth9" Oct 03 00:30:03 crc kubenswrapper[4832]: I1003 00:30:03.146186 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1252e050-f2e5-4c87-8982-e8c4a43ec707-config-volume\") pod \"1252e050-f2e5-4c87-8982-e8c4a43ec707\" (UID: \"1252e050-f2e5-4c87-8982-e8c4a43ec707\") " Oct 03 00:30:03 crc kubenswrapper[4832]: I1003 00:30:03.146296 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-986z4\" (UniqueName: \"kubernetes.io/projected/1252e050-f2e5-4c87-8982-e8c4a43ec707-kube-api-access-986z4\") pod \"1252e050-f2e5-4c87-8982-e8c4a43ec707\" (UID: \"1252e050-f2e5-4c87-8982-e8c4a43ec707\") " Oct 03 00:30:03 crc kubenswrapper[4832]: I1003 00:30:03.147040 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1252e050-f2e5-4c87-8982-e8c4a43ec707-config-volume" (OuterVolumeSpecName: "config-volume") pod "1252e050-f2e5-4c87-8982-e8c4a43ec707" (UID: "1252e050-f2e5-4c87-8982-e8c4a43ec707"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 00:30:03 crc kubenswrapper[4832]: I1003 00:30:03.147333 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1252e050-f2e5-4c87-8982-e8c4a43ec707-secret-volume\") pod \"1252e050-f2e5-4c87-8982-e8c4a43ec707\" (UID: \"1252e050-f2e5-4c87-8982-e8c4a43ec707\") " Oct 03 00:30:03 crc kubenswrapper[4832]: I1003 00:30:03.147619 4832 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1252e050-f2e5-4c87-8982-e8c4a43ec707-config-volume\") on node \"crc\" DevicePath \"\"" Oct 03 00:30:03 crc kubenswrapper[4832]: I1003 00:30:03.151459 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1252e050-f2e5-4c87-8982-e8c4a43ec707-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "1252e050-f2e5-4c87-8982-e8c4a43ec707" (UID: "1252e050-f2e5-4c87-8982-e8c4a43ec707"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 00:30:03 crc kubenswrapper[4832]: I1003 00:30:03.151890 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1252e050-f2e5-4c87-8982-e8c4a43ec707-kube-api-access-986z4" (OuterVolumeSpecName: "kube-api-access-986z4") pod "1252e050-f2e5-4c87-8982-e8c4a43ec707" (UID: "1252e050-f2e5-4c87-8982-e8c4a43ec707"). InnerVolumeSpecName "kube-api-access-986z4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 00:30:03 crc kubenswrapper[4832]: I1003 00:30:03.248192 4832 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-986z4\" (UniqueName: \"kubernetes.io/projected/1252e050-f2e5-4c87-8982-e8c4a43ec707-kube-api-access-986z4\") on node \"crc\" DevicePath \"\"" Oct 03 00:30:03 crc kubenswrapper[4832]: I1003 00:30:03.248496 4832 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1252e050-f2e5-4c87-8982-e8c4a43ec707-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 03 00:30:03 crc kubenswrapper[4832]: I1003 00:30:03.839238 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324190-7hth9" event={"ID":"1252e050-f2e5-4c87-8982-e8c4a43ec707","Type":"ContainerDied","Data":"b2739edaca68adcc48700d86078b11c1657ab044c865d8d2bc9f1bae75fa1006"} Oct 03 00:30:03 crc kubenswrapper[4832]: I1003 00:30:03.839275 4832 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b2739edaca68adcc48700d86078b11c1657ab044c865d8d2bc9f1bae75fa1006" Oct 03 00:30:03 crc kubenswrapper[4832]: I1003 00:30:03.839305 4832 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324190-7hth9" Oct 03 00:30:26 crc kubenswrapper[4832]: I1003 00:30:26.397059 4832 patch_prober.go:28] interesting pod/machine-config-daemon-xzpk4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 00:30:26 crc kubenswrapper[4832]: I1003 00:30:26.397607 4832 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" podUID="13eeb165-19f7-4e3f-a442-1c9c570d853b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 00:30:56 crc kubenswrapper[4832]: I1003 00:30:56.396631 4832 patch_prober.go:28] interesting pod/machine-config-daemon-xzpk4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 00:30:56 crc kubenswrapper[4832]: I1003 00:30:56.397287 4832 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" podUID="13eeb165-19f7-4e3f-a442-1c9c570d853b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 00:31:05 crc kubenswrapper[4832]: I1003 00:31:05.856079 4832 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-pbmj5"] Oct 03 00:31:05 crc kubenswrapper[4832]: E1003 00:31:05.856928 4832 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1252e050-f2e5-4c87-8982-e8c4a43ec707" containerName="collect-profiles" Oct 03 00:31:05 crc kubenswrapper[4832]: I1003 00:31:05.856948 4832 state_mem.go:107] "Deleted CPUSet assignment" podUID="1252e050-f2e5-4c87-8982-e8c4a43ec707" containerName="collect-profiles" Oct 03 00:31:05 crc kubenswrapper[4832]: I1003 00:31:05.857143 4832 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="1252e050-f2e5-4c87-8982-e8c4a43ec707" containerName="collect-profiles" Oct 03 00:31:05 crc kubenswrapper[4832]: I1003 00:31:05.858533 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pbmj5" Oct 03 00:31:05 crc kubenswrapper[4832]: I1003 00:31:05.863733 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pbmj5"] Oct 03 00:31:05 crc kubenswrapper[4832]: I1003 00:31:05.943086 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16acf770-b28d-402f-8494-bad005f32abb-utilities\") pod \"redhat-operators-pbmj5\" (UID: \"16acf770-b28d-402f-8494-bad005f32abb\") " pod="openshift-marketplace/redhat-operators-pbmj5" Oct 03 00:31:05 crc kubenswrapper[4832]: I1003 00:31:05.943151 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvff6\" (UniqueName: \"kubernetes.io/projected/16acf770-b28d-402f-8494-bad005f32abb-kube-api-access-lvff6\") pod \"redhat-operators-pbmj5\" (UID: \"16acf770-b28d-402f-8494-bad005f32abb\") " pod="openshift-marketplace/redhat-operators-pbmj5" Oct 03 00:31:05 crc kubenswrapper[4832]: I1003 00:31:05.943249 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16acf770-b28d-402f-8494-bad005f32abb-catalog-content\") pod \"redhat-operators-pbmj5\" (UID: \"16acf770-b28d-402f-8494-bad005f32abb\") " pod="openshift-marketplace/redhat-operators-pbmj5" Oct 03 00:31:06 crc kubenswrapper[4832]: I1003 00:31:06.044237 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16acf770-b28d-402f-8494-bad005f32abb-catalog-content\") pod \"redhat-operators-pbmj5\" (UID: \"16acf770-b28d-402f-8494-bad005f32abb\") " pod="openshift-marketplace/redhat-operators-pbmj5" Oct 03 00:31:06 crc kubenswrapper[4832]: I1003 00:31:06.044331 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16acf770-b28d-402f-8494-bad005f32abb-utilities\") pod \"redhat-operators-pbmj5\" (UID: \"16acf770-b28d-402f-8494-bad005f32abb\") " pod="openshift-marketplace/redhat-operators-pbmj5" Oct 03 00:31:06 crc kubenswrapper[4832]: I1003 00:31:06.044383 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lvff6\" (UniqueName: \"kubernetes.io/projected/16acf770-b28d-402f-8494-bad005f32abb-kube-api-access-lvff6\") pod \"redhat-operators-pbmj5\" (UID: \"16acf770-b28d-402f-8494-bad005f32abb\") " pod="openshift-marketplace/redhat-operators-pbmj5" Oct 03 00:31:06 crc kubenswrapper[4832]: I1003 00:31:06.044800 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16acf770-b28d-402f-8494-bad005f32abb-catalog-content\") pod \"redhat-operators-pbmj5\" (UID: \"16acf770-b28d-402f-8494-bad005f32abb\") " pod="openshift-marketplace/redhat-operators-pbmj5" Oct 03 00:31:06 crc kubenswrapper[4832]: I1003 00:31:06.044850 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16acf770-b28d-402f-8494-bad005f32abb-utilities\") pod \"redhat-operators-pbmj5\" (UID: 
\"16acf770-b28d-402f-8494-bad005f32abb\") " pod="openshift-marketplace/redhat-operators-pbmj5" Oct 03 00:31:06 crc kubenswrapper[4832]: I1003 00:31:06.070795 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lvff6\" (UniqueName: \"kubernetes.io/projected/16acf770-b28d-402f-8494-bad005f32abb-kube-api-access-lvff6\") pod \"redhat-operators-pbmj5\" (UID: \"16acf770-b28d-402f-8494-bad005f32abb\") " pod="openshift-marketplace/redhat-operators-pbmj5" Oct 03 00:31:06 crc kubenswrapper[4832]: I1003 00:31:06.178600 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pbmj5" Oct 03 00:31:06 crc kubenswrapper[4832]: I1003 00:31:06.366539 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pbmj5"] Oct 03 00:31:06 crc kubenswrapper[4832]: W1003 00:31:06.374337 4832 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod16acf770_b28d_402f_8494_bad005f32abb.slice/crio-3d452144e7dead4e8fc1760f1cd3a378a94d386195a2abec72e9555cecc7b9a7 WatchSource:0}: Error finding container 3d452144e7dead4e8fc1760f1cd3a378a94d386195a2abec72e9555cecc7b9a7: Status 404 returned error can't find the container with id 3d452144e7dead4e8fc1760f1cd3a378a94d386195a2abec72e9555cecc7b9a7 Oct 03 00:31:07 crc kubenswrapper[4832]: I1003 00:31:07.212451 4832 generic.go:334] "Generic (PLEG): container finished" podID="16acf770-b28d-402f-8494-bad005f32abb" containerID="93d67656d510830f2f41c0f608cbd98162ef90cdd02b84def2739f7e9e1984e0" exitCode=0 Oct 03 00:31:07 crc kubenswrapper[4832]: I1003 00:31:07.212553 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pbmj5" event={"ID":"16acf770-b28d-402f-8494-bad005f32abb","Type":"ContainerDied","Data":"93d67656d510830f2f41c0f608cbd98162ef90cdd02b84def2739f7e9e1984e0"} Oct 03 00:31:07 crc kubenswrapper[4832]: I1003 00:31:07.212708 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pbmj5" event={"ID":"16acf770-b28d-402f-8494-bad005f32abb","Type":"ContainerStarted","Data":"3d452144e7dead4e8fc1760f1cd3a378a94d386195a2abec72e9555cecc7b9a7"} Oct 03 00:31:07 crc kubenswrapper[4832]: I1003 00:31:07.214915 4832 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 03 00:31:15 crc kubenswrapper[4832]: I1003 00:31:15.257805 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pbmj5" event={"ID":"16acf770-b28d-402f-8494-bad005f32abb","Type":"ContainerStarted","Data":"53b25165fa44c1a6bf2cb06026718cc276dd7a6b12b94a064f3559ebb8ab1e9a"} Oct 03 00:31:17 crc kubenswrapper[4832]: I1003 00:31:17.268476 4832 generic.go:334] "Generic (PLEG): container finished" podID="16acf770-b28d-402f-8494-bad005f32abb" containerID="53b25165fa44c1a6bf2cb06026718cc276dd7a6b12b94a064f3559ebb8ab1e9a" exitCode=0 Oct 03 00:31:17 crc kubenswrapper[4832]: I1003 00:31:17.268517 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pbmj5" event={"ID":"16acf770-b28d-402f-8494-bad005f32abb","Type":"ContainerDied","Data":"53b25165fa44c1a6bf2cb06026718cc276dd7a6b12b94a064f3559ebb8ab1e9a"} Oct 03 00:31:19 crc kubenswrapper[4832]: I1003 00:31:19.280388 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pbmj5" 
event={"ID":"16acf770-b28d-402f-8494-bad005f32abb","Type":"ContainerStarted","Data":"91d0e3ea50c2d998eb9b7b9fbd8943d03fc94ab4a02737b9ae08c7fb94d4ac27"} Oct 03 00:31:21 crc kubenswrapper[4832]: I1003 00:31:21.310054 4832 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-pbmj5" podStartSLOduration=4.852965178 podStartE2EDuration="16.310037505s" podCreationTimestamp="2025-10-03 00:31:05 +0000 UTC" firstStartedPulling="2025-10-03 00:31:07.214243653 +0000 UTC m=+1460.476398799" lastFinishedPulling="2025-10-03 00:31:18.67131602 +0000 UTC m=+1471.933471126" observedRunningTime="2025-10-03 00:31:21.308966131 +0000 UTC m=+1474.571121257" watchObservedRunningTime="2025-10-03 00:31:21.310037505 +0000 UTC m=+1474.572192621" Oct 03 00:31:26 crc kubenswrapper[4832]: I1003 00:31:26.178829 4832 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-pbmj5" Oct 03 00:31:26 crc kubenswrapper[4832]: I1003 00:31:26.179391 4832 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-pbmj5" Oct 03 00:31:26 crc kubenswrapper[4832]: I1003 00:31:26.220382 4832 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-pbmj5" Oct 03 00:31:26 crc kubenswrapper[4832]: I1003 00:31:26.351137 4832 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-pbmj5" Oct 03 00:31:26 crc kubenswrapper[4832]: I1003 00:31:26.397033 4832 patch_prober.go:28] interesting pod/machine-config-daemon-xzpk4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 00:31:26 crc kubenswrapper[4832]: I1003 00:31:26.397098 4832 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" podUID="13eeb165-19f7-4e3f-a442-1c9c570d853b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 00:31:26 crc kubenswrapper[4832]: I1003 00:31:26.397147 4832 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" Oct 03 00:31:26 crc kubenswrapper[4832]: I1003 00:31:26.397718 4832 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e44b7de9ac0dc4f733d645c5d56e63447d6c890efc8c243af73d5c7937b609ee"} pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 00:31:26 crc kubenswrapper[4832]: I1003 00:31:26.397806 4832 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" podUID="13eeb165-19f7-4e3f-a442-1c9c570d853b" containerName="machine-config-daemon" containerID="cri-o://e44b7de9ac0dc4f733d645c5d56e63447d6c890efc8c243af73d5c7937b609ee" gracePeriod=600 Oct 03 00:31:26 crc kubenswrapper[4832]: I1003 00:31:26.452698 4832 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-pbmj5"] Oct 03 00:31:26 crc kubenswrapper[4832]: E1003 00:31:26.523618 4832 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xzpk4_openshift-machine-config-operator(13eeb165-19f7-4e3f-a442-1c9c570d853b)\"" pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" podUID="13eeb165-19f7-4e3f-a442-1c9c570d853b" Oct 03 00:31:27 crc kubenswrapper[4832]: I1003 00:31:27.321794 4832 generic.go:334] "Generic (PLEG): container finished" podID="13eeb165-19f7-4e3f-a442-1c9c570d853b" containerID="e44b7de9ac0dc4f733d645c5d56e63447d6c890efc8c243af73d5c7937b609ee" exitCode=0 Oct 03 00:31:27 crc kubenswrapper[4832]: I1003 00:31:27.321872 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" event={"ID":"13eeb165-19f7-4e3f-a442-1c9c570d853b","Type":"ContainerDied","Data":"e44b7de9ac0dc4f733d645c5d56e63447d6c890efc8c243af73d5c7937b609ee"} Oct 03 00:31:27 crc kubenswrapper[4832]: I1003 00:31:27.321944 4832 scope.go:117] "RemoveContainer" containerID="369e795d8a4b85988f84323e9f3fc20c7b786482341179ac65c2354619c74c30" Oct 03 00:31:27 crc kubenswrapper[4832]: I1003 00:31:27.322586 4832 scope.go:117] "RemoveContainer" containerID="e44b7de9ac0dc4f733d645c5d56e63447d6c890efc8c243af73d5c7937b609ee" Oct 03 00:31:27 crc kubenswrapper[4832]: E1003 00:31:27.322874 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xzpk4_openshift-machine-config-operator(13eeb165-19f7-4e3f-a442-1c9c570d853b)\"" pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" podUID="13eeb165-19f7-4e3f-a442-1c9c570d853b" Oct 03 00:31:28 crc kubenswrapper[4832]: I1003 00:31:28.339854 4832 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-pbmj5" podUID="16acf770-b28d-402f-8494-bad005f32abb" containerName="registry-server" containerID="cri-o://91d0e3ea50c2d998eb9b7b9fbd8943d03fc94ab4a02737b9ae08c7fb94d4ac27" gracePeriod=2 Oct 03 00:31:28 crc kubenswrapper[4832]: I1003 00:31:28.787946 4832 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-pbmj5" Oct 03 00:31:28 crc kubenswrapper[4832]: I1003 00:31:28.936051 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16acf770-b28d-402f-8494-bad005f32abb-utilities\") pod \"16acf770-b28d-402f-8494-bad005f32abb\" (UID: \"16acf770-b28d-402f-8494-bad005f32abb\") " Oct 03 00:31:28 crc kubenswrapper[4832]: I1003 00:31:28.936136 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lvff6\" (UniqueName: \"kubernetes.io/projected/16acf770-b28d-402f-8494-bad005f32abb-kube-api-access-lvff6\") pod \"16acf770-b28d-402f-8494-bad005f32abb\" (UID: \"16acf770-b28d-402f-8494-bad005f32abb\") " Oct 03 00:31:28 crc kubenswrapper[4832]: I1003 00:31:28.936157 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16acf770-b28d-402f-8494-bad005f32abb-catalog-content\") pod \"16acf770-b28d-402f-8494-bad005f32abb\" (UID: \"16acf770-b28d-402f-8494-bad005f32abb\") " Oct 03 00:31:28 crc kubenswrapper[4832]: I1003 00:31:28.937251 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/16acf770-b28d-402f-8494-bad005f32abb-utilities" (OuterVolumeSpecName: "utilities") pod "16acf770-b28d-402f-8494-bad005f32abb" (UID: "16acf770-b28d-402f-8494-bad005f32abb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 00:31:28 crc kubenswrapper[4832]: I1003 00:31:28.942022 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16acf770-b28d-402f-8494-bad005f32abb-kube-api-access-lvff6" (OuterVolumeSpecName: "kube-api-access-lvff6") pod "16acf770-b28d-402f-8494-bad005f32abb" (UID: "16acf770-b28d-402f-8494-bad005f32abb"). InnerVolumeSpecName "kube-api-access-lvff6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 00:31:29 crc kubenswrapper[4832]: I1003 00:31:29.021600 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/16acf770-b28d-402f-8494-bad005f32abb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "16acf770-b28d-402f-8494-bad005f32abb" (UID: "16acf770-b28d-402f-8494-bad005f32abb"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 00:31:29 crc kubenswrapper[4832]: I1003 00:31:29.037728 4832 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16acf770-b28d-402f-8494-bad005f32abb-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 00:31:29 crc kubenswrapper[4832]: I1003 00:31:29.037771 4832 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lvff6\" (UniqueName: \"kubernetes.io/projected/16acf770-b28d-402f-8494-bad005f32abb-kube-api-access-lvff6\") on node \"crc\" DevicePath \"\"" Oct 03 00:31:29 crc kubenswrapper[4832]: I1003 00:31:29.037796 4832 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16acf770-b28d-402f-8494-bad005f32abb-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 00:31:29 crc kubenswrapper[4832]: I1003 00:31:29.347246 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-2-build" event={"ID":"647b857f-9035-4675-89e1-29a2f26cd19d","Type":"ContainerDied","Data":"2e8296d563b797ef878168136476016ce4df1c7f28445a1a87f02f8d71d88e1b"} Oct 03 00:31:29 crc kubenswrapper[4832]: I1003 00:31:29.347173 4832 generic.go:334] "Generic (PLEG): container finished" podID="647b857f-9035-4675-89e1-29a2f26cd19d" containerID="2e8296d563b797ef878168136476016ce4df1c7f28445a1a87f02f8d71d88e1b" exitCode=0 Oct 03 00:31:29 crc kubenswrapper[4832]: I1003 00:31:29.349544 4832 generic.go:334] "Generic (PLEG): container finished" podID="16acf770-b28d-402f-8494-bad005f32abb" containerID="91d0e3ea50c2d998eb9b7b9fbd8943d03fc94ab4a02737b9ae08c7fb94d4ac27" exitCode=0 Oct 03 00:31:29 crc kubenswrapper[4832]: I1003 00:31:29.349570 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pbmj5" event={"ID":"16acf770-b28d-402f-8494-bad005f32abb","Type":"ContainerDied","Data":"91d0e3ea50c2d998eb9b7b9fbd8943d03fc94ab4a02737b9ae08c7fb94d4ac27"} Oct 03 00:31:29 crc kubenswrapper[4832]: I1003 00:31:29.349583 4832 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-pbmj5" Oct 03 00:31:29 crc kubenswrapper[4832]: I1003 00:31:29.349598 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pbmj5" event={"ID":"16acf770-b28d-402f-8494-bad005f32abb","Type":"ContainerDied","Data":"3d452144e7dead4e8fc1760f1cd3a378a94d386195a2abec72e9555cecc7b9a7"} Oct 03 00:31:29 crc kubenswrapper[4832]: I1003 00:31:29.349617 4832 scope.go:117] "RemoveContainer" containerID="91d0e3ea50c2d998eb9b7b9fbd8943d03fc94ab4a02737b9ae08c7fb94d4ac27" Oct 03 00:31:29 crc kubenswrapper[4832]: I1003 00:31:29.364053 4832 scope.go:117] "RemoveContainer" containerID="53b25165fa44c1a6bf2cb06026718cc276dd7a6b12b94a064f3559ebb8ab1e9a" Oct 03 00:31:29 crc kubenswrapper[4832]: I1003 00:31:29.386457 4832 scope.go:117] "RemoveContainer" containerID="93d67656d510830f2f41c0f608cbd98162ef90cdd02b84def2739f7e9e1984e0" Oct 03 00:31:29 crc kubenswrapper[4832]: I1003 00:31:29.395227 4832 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-pbmj5"] Oct 03 00:31:29 crc kubenswrapper[4832]: I1003 00:31:29.395258 4832 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-pbmj5"] Oct 03 00:31:29 crc kubenswrapper[4832]: I1003 00:31:29.403440 4832 scope.go:117] "RemoveContainer" containerID="91d0e3ea50c2d998eb9b7b9fbd8943d03fc94ab4a02737b9ae08c7fb94d4ac27" Oct 03 00:31:29 crc kubenswrapper[4832]: E1003 00:31:29.403740 4832 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"91d0e3ea50c2d998eb9b7b9fbd8943d03fc94ab4a02737b9ae08c7fb94d4ac27\": container with ID starting with 91d0e3ea50c2d998eb9b7b9fbd8943d03fc94ab4a02737b9ae08c7fb94d4ac27 not found: ID does not exist" containerID="91d0e3ea50c2d998eb9b7b9fbd8943d03fc94ab4a02737b9ae08c7fb94d4ac27" Oct 03 00:31:29 crc kubenswrapper[4832]: I1003 00:31:29.403769 4832 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91d0e3ea50c2d998eb9b7b9fbd8943d03fc94ab4a02737b9ae08c7fb94d4ac27"} err="failed to get container status \"91d0e3ea50c2d998eb9b7b9fbd8943d03fc94ab4a02737b9ae08c7fb94d4ac27\": rpc error: code = NotFound desc = could not find container \"91d0e3ea50c2d998eb9b7b9fbd8943d03fc94ab4a02737b9ae08c7fb94d4ac27\": container with ID starting with 91d0e3ea50c2d998eb9b7b9fbd8943d03fc94ab4a02737b9ae08c7fb94d4ac27 not found: ID does not exist" Oct 03 00:31:29 crc kubenswrapper[4832]: I1003 00:31:29.403821 4832 scope.go:117] "RemoveContainer" containerID="53b25165fa44c1a6bf2cb06026718cc276dd7a6b12b94a064f3559ebb8ab1e9a" Oct 03 00:31:29 crc kubenswrapper[4832]: E1003 00:31:29.404033 4832 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"53b25165fa44c1a6bf2cb06026718cc276dd7a6b12b94a064f3559ebb8ab1e9a\": container with ID starting with 53b25165fa44c1a6bf2cb06026718cc276dd7a6b12b94a064f3559ebb8ab1e9a not found: ID does not exist" containerID="53b25165fa44c1a6bf2cb06026718cc276dd7a6b12b94a064f3559ebb8ab1e9a" Oct 03 00:31:29 crc kubenswrapper[4832]: I1003 00:31:29.404056 4832 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53b25165fa44c1a6bf2cb06026718cc276dd7a6b12b94a064f3559ebb8ab1e9a"} err="failed to get container status \"53b25165fa44c1a6bf2cb06026718cc276dd7a6b12b94a064f3559ebb8ab1e9a\": rpc error: code = NotFound desc = could not find container 
\"53b25165fa44c1a6bf2cb06026718cc276dd7a6b12b94a064f3559ebb8ab1e9a\": container with ID starting with 53b25165fa44c1a6bf2cb06026718cc276dd7a6b12b94a064f3559ebb8ab1e9a not found: ID does not exist" Oct 03 00:31:29 crc kubenswrapper[4832]: I1003 00:31:29.404069 4832 scope.go:117] "RemoveContainer" containerID="93d67656d510830f2f41c0f608cbd98162ef90cdd02b84def2739f7e9e1984e0" Oct 03 00:31:29 crc kubenswrapper[4832]: E1003 00:31:29.404263 4832 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93d67656d510830f2f41c0f608cbd98162ef90cdd02b84def2739f7e9e1984e0\": container with ID starting with 93d67656d510830f2f41c0f608cbd98162ef90cdd02b84def2739f7e9e1984e0 not found: ID does not exist" containerID="93d67656d510830f2f41c0f608cbd98162ef90cdd02b84def2739f7e9e1984e0" Oct 03 00:31:29 crc kubenswrapper[4832]: I1003 00:31:29.404289 4832 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93d67656d510830f2f41c0f608cbd98162ef90cdd02b84def2739f7e9e1984e0"} err="failed to get container status \"93d67656d510830f2f41c0f608cbd98162ef90cdd02b84def2739f7e9e1984e0\": rpc error: code = NotFound desc = could not find container \"93d67656d510830f2f41c0f608cbd98162ef90cdd02b84def2739f7e9e1984e0\": container with ID starting with 93d67656d510830f2f41c0f608cbd98162ef90cdd02b84def2739f7e9e1984e0 not found: ID does not exist" Oct 03 00:31:30 crc kubenswrapper[4832]: I1003 00:31:30.598499 4832 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-core-2-build" Oct 03 00:31:30 crc kubenswrapper[4832]: I1003 00:31:30.682321 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/647b857f-9035-4675-89e1-29a2f26cd19d-container-storage-root\") pod \"647b857f-9035-4675-89e1-29a2f26cd19d\" (UID: \"647b857f-9035-4675-89e1-29a2f26cd19d\") " Oct 03 00:31:30 crc kubenswrapper[4832]: I1003 00:31:30.682378 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/647b857f-9035-4675-89e1-29a2f26cd19d-container-storage-run\") pod \"647b857f-9035-4675-89e1-29a2f26cd19d\" (UID: \"647b857f-9035-4675-89e1-29a2f26cd19d\") " Oct 03 00:31:30 crc kubenswrapper[4832]: I1003 00:31:30.682403 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/647b857f-9035-4675-89e1-29a2f26cd19d-build-system-configs\") pod \"647b857f-9035-4675-89e1-29a2f26cd19d\" (UID: \"647b857f-9035-4675-89e1-29a2f26cd19d\") " Oct 03 00:31:30 crc kubenswrapper[4832]: I1003 00:31:30.682454 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/647b857f-9035-4675-89e1-29a2f26cd19d-build-blob-cache\") pod \"647b857f-9035-4675-89e1-29a2f26cd19d\" (UID: \"647b857f-9035-4675-89e1-29a2f26cd19d\") " Oct 03 00:31:30 crc kubenswrapper[4832]: I1003 00:31:30.682496 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-bl95s-pull\" (UniqueName: \"kubernetes.io/secret/647b857f-9035-4675-89e1-29a2f26cd19d-builder-dockercfg-bl95s-pull\") pod \"647b857f-9035-4675-89e1-29a2f26cd19d\" (UID: \"647b857f-9035-4675-89e1-29a2f26cd19d\") " Oct 03 00:31:30 crc kubenswrapper[4832]: I1003 00:31:30.682528 4832 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/647b857f-9035-4675-89e1-29a2f26cd19d-buildworkdir\") pod \"647b857f-9035-4675-89e1-29a2f26cd19d\" (UID: \"647b857f-9035-4675-89e1-29a2f26cd19d\") " Oct 03 00:31:30 crc kubenswrapper[4832]: I1003 00:31:30.682547 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/647b857f-9035-4675-89e1-29a2f26cd19d-buildcachedir\") pod \"647b857f-9035-4675-89e1-29a2f26cd19d\" (UID: \"647b857f-9035-4675-89e1-29a2f26cd19d\") " Oct 03 00:31:30 crc kubenswrapper[4832]: I1003 00:31:30.682600 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/647b857f-9035-4675-89e1-29a2f26cd19d-build-ca-bundles\") pod \"647b857f-9035-4675-89e1-29a2f26cd19d\" (UID: \"647b857f-9035-4675-89e1-29a2f26cd19d\") " Oct 03 00:31:30 crc kubenswrapper[4832]: I1003 00:31:30.682624 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/647b857f-9035-4675-89e1-29a2f26cd19d-build-proxy-ca-bundles\") pod \"647b857f-9035-4675-89e1-29a2f26cd19d\" (UID: \"647b857f-9035-4675-89e1-29a2f26cd19d\") " Oct 03 00:31:30 crc kubenswrapper[4832]: I1003 00:31:30.682652 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zq4zm\" (UniqueName: \"kubernetes.io/projected/647b857f-9035-4675-89e1-29a2f26cd19d-kube-api-access-zq4zm\") pod \"647b857f-9035-4675-89e1-29a2f26cd19d\" (UID: \"647b857f-9035-4675-89e1-29a2f26cd19d\") " Oct 03 00:31:30 crc kubenswrapper[4832]: I1003 00:31:30.682671 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-bl95s-push\" (UniqueName: \"kubernetes.io/secret/647b857f-9035-4675-89e1-29a2f26cd19d-builder-dockercfg-bl95s-push\") pod \"647b857f-9035-4675-89e1-29a2f26cd19d\" (UID: \"647b857f-9035-4675-89e1-29a2f26cd19d\") " Oct 03 00:31:30 crc kubenswrapper[4832]: I1003 00:31:30.682685 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/647b857f-9035-4675-89e1-29a2f26cd19d-node-pullsecrets\") pod \"647b857f-9035-4675-89e1-29a2f26cd19d\" (UID: \"647b857f-9035-4675-89e1-29a2f26cd19d\") " Oct 03 00:31:30 crc kubenswrapper[4832]: I1003 00:31:30.682922 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/647b857f-9035-4675-89e1-29a2f26cd19d-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "647b857f-9035-4675-89e1-29a2f26cd19d" (UID: "647b857f-9035-4675-89e1-29a2f26cd19d"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 00:31:30 crc kubenswrapper[4832]: I1003 00:31:30.683656 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/647b857f-9035-4675-89e1-29a2f26cd19d-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "647b857f-9035-4675-89e1-29a2f26cd19d" (UID: "647b857f-9035-4675-89e1-29a2f26cd19d"). InnerVolumeSpecName "build-proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 00:31:30 crc kubenswrapper[4832]: I1003 00:31:30.683706 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/647b857f-9035-4675-89e1-29a2f26cd19d-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "647b857f-9035-4675-89e1-29a2f26cd19d" (UID: "647b857f-9035-4675-89e1-29a2f26cd19d"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 00:31:30 crc kubenswrapper[4832]: I1003 00:31:30.684895 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/647b857f-9035-4675-89e1-29a2f26cd19d-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "647b857f-9035-4675-89e1-29a2f26cd19d" (UID: "647b857f-9035-4675-89e1-29a2f26cd19d"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 00:31:30 crc kubenswrapper[4832]: I1003 00:31:30.686215 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/647b857f-9035-4675-89e1-29a2f26cd19d-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "647b857f-9035-4675-89e1-29a2f26cd19d" (UID: "647b857f-9035-4675-89e1-29a2f26cd19d"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 00:31:30 crc kubenswrapper[4832]: I1003 00:31:30.688205 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/647b857f-9035-4675-89e1-29a2f26cd19d-builder-dockercfg-bl95s-pull" (OuterVolumeSpecName: "builder-dockercfg-bl95s-pull") pod "647b857f-9035-4675-89e1-29a2f26cd19d" (UID: "647b857f-9035-4675-89e1-29a2f26cd19d"). InnerVolumeSpecName "builder-dockercfg-bl95s-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 00:31:30 crc kubenswrapper[4832]: I1003 00:31:30.688343 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/647b857f-9035-4675-89e1-29a2f26cd19d-kube-api-access-zq4zm" (OuterVolumeSpecName: "kube-api-access-zq4zm") pod "647b857f-9035-4675-89e1-29a2f26cd19d" (UID: "647b857f-9035-4675-89e1-29a2f26cd19d"). InnerVolumeSpecName "kube-api-access-zq4zm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 00:31:30 crc kubenswrapper[4832]: I1003 00:31:30.689203 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/647b857f-9035-4675-89e1-29a2f26cd19d-builder-dockercfg-bl95s-push" (OuterVolumeSpecName: "builder-dockercfg-bl95s-push") pod "647b857f-9035-4675-89e1-29a2f26cd19d" (UID: "647b857f-9035-4675-89e1-29a2f26cd19d"). InnerVolumeSpecName "builder-dockercfg-bl95s-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 00:31:30 crc kubenswrapper[4832]: I1003 00:31:30.692960 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/647b857f-9035-4675-89e1-29a2f26cd19d-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "647b857f-9035-4675-89e1-29a2f26cd19d" (UID: "647b857f-9035-4675-89e1-29a2f26cd19d"). InnerVolumeSpecName "buildworkdir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 00:31:30 crc kubenswrapper[4832]: I1003 00:31:30.693556 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/647b857f-9035-4675-89e1-29a2f26cd19d-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "647b857f-9035-4675-89e1-29a2f26cd19d" (UID: "647b857f-9035-4675-89e1-29a2f26cd19d"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 00:31:30 crc kubenswrapper[4832]: I1003 00:31:30.783994 4832 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-bl95s-pull\" (UniqueName: \"kubernetes.io/secret/647b857f-9035-4675-89e1-29a2f26cd19d-builder-dockercfg-bl95s-pull\") on node \"crc\" DevicePath \"\"" Oct 03 00:31:30 crc kubenswrapper[4832]: I1003 00:31:30.784028 4832 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/647b857f-9035-4675-89e1-29a2f26cd19d-buildworkdir\") on node \"crc\" DevicePath \"\"" Oct 03 00:31:30 crc kubenswrapper[4832]: I1003 00:31:30.784074 4832 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/647b857f-9035-4675-89e1-29a2f26cd19d-buildcachedir\") on node \"crc\" DevicePath \"\"" Oct 03 00:31:30 crc kubenswrapper[4832]: I1003 00:31:30.784088 4832 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/647b857f-9035-4675-89e1-29a2f26cd19d-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 03 00:31:30 crc kubenswrapper[4832]: I1003 00:31:30.784098 4832 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/647b857f-9035-4675-89e1-29a2f26cd19d-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 03 00:31:30 crc kubenswrapper[4832]: I1003 00:31:30.784108 4832 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zq4zm\" (UniqueName: \"kubernetes.io/projected/647b857f-9035-4675-89e1-29a2f26cd19d-kube-api-access-zq4zm\") on node \"crc\" DevicePath \"\"" Oct 03 00:31:30 crc kubenswrapper[4832]: I1003 00:31:30.784117 4832 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-bl95s-push\" (UniqueName: \"kubernetes.io/secret/647b857f-9035-4675-89e1-29a2f26cd19d-builder-dockercfg-bl95s-push\") on node \"crc\" DevicePath \"\"" Oct 03 00:31:30 crc kubenswrapper[4832]: I1003 00:31:30.784127 4832 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/647b857f-9035-4675-89e1-29a2f26cd19d-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Oct 03 00:31:30 crc kubenswrapper[4832]: I1003 00:31:30.784135 4832 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/647b857f-9035-4675-89e1-29a2f26cd19d-container-storage-run\") on node \"crc\" DevicePath \"\"" Oct 03 00:31:30 crc kubenswrapper[4832]: I1003 00:31:30.784143 4832 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/647b857f-9035-4675-89e1-29a2f26cd19d-build-system-configs\") on node \"crc\" DevicePath \"\"" Oct 03 00:31:30 crc kubenswrapper[4832]: I1003 00:31:30.999374 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/647b857f-9035-4675-89e1-29a2f26cd19d-build-blob-cache" (OuterVolumeSpecName: 
"build-blob-cache") pod "647b857f-9035-4675-89e1-29a2f26cd19d" (UID: "647b857f-9035-4675-89e1-29a2f26cd19d"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 00:31:31 crc kubenswrapper[4832]: I1003 00:31:31.087974 4832 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/647b857f-9035-4675-89e1-29a2f26cd19d-build-blob-cache\") on node \"crc\" DevicePath \"\"" Oct 03 00:31:31 crc kubenswrapper[4832]: I1003 00:31:31.362968 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-2-build" event={"ID":"647b857f-9035-4675-89e1-29a2f26cd19d","Type":"ContainerDied","Data":"7cac01ea47885a2143e4f81e8cee5d72de6bd842d9a38fece37d7041914529c0"} Oct 03 00:31:31 crc kubenswrapper[4832]: I1003 00:31:31.363006 4832 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7cac01ea47885a2143e4f81e8cee5d72de6bd842d9a38fece37d7041914529c0" Oct 03 00:31:31 crc kubenswrapper[4832]: I1003 00:31:31.363071 4832 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-core-2-build" Oct 03 00:31:31 crc kubenswrapper[4832]: I1003 00:31:31.393884 4832 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16acf770-b28d-402f-8494-bad005f32abb" path="/var/lib/kubelet/pods/16acf770-b28d-402f-8494-bad005f32abb/volumes" Oct 03 00:31:33 crc kubenswrapper[4832]: I1003 00:31:33.013197 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/647b857f-9035-4675-89e1-29a2f26cd19d-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "647b857f-9035-4675-89e1-29a2f26cd19d" (UID: "647b857f-9035-4675-89e1-29a2f26cd19d"). InnerVolumeSpecName "container-storage-root". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 00:31:33 crc kubenswrapper[4832]: I1003 00:31:33.018167 4832 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/647b857f-9035-4675-89e1-29a2f26cd19d-container-storage-root\") on node \"crc\" DevicePath \"\"" Oct 03 00:31:35 crc kubenswrapper[4832]: I1003 00:31:35.616849 4832 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/sg-bridge-1-build"] Oct 03 00:31:35 crc kubenswrapper[4832]: E1003 00:31:35.617424 4832 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16acf770-b28d-402f-8494-bad005f32abb" containerName="registry-server" Oct 03 00:31:35 crc kubenswrapper[4832]: I1003 00:31:35.617441 4832 state_mem.go:107] "Deleted CPUSet assignment" podUID="16acf770-b28d-402f-8494-bad005f32abb" containerName="registry-server" Oct 03 00:31:35 crc kubenswrapper[4832]: E1003 00:31:35.617456 4832 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="647b857f-9035-4675-89e1-29a2f26cd19d" containerName="manage-dockerfile" Oct 03 00:31:35 crc kubenswrapper[4832]: I1003 00:31:35.617464 4832 state_mem.go:107] "Deleted CPUSet assignment" podUID="647b857f-9035-4675-89e1-29a2f26cd19d" containerName="manage-dockerfile" Oct 03 00:31:35 crc kubenswrapper[4832]: E1003 00:31:35.617485 4832 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16acf770-b28d-402f-8494-bad005f32abb" containerName="extract-content" Oct 03 00:31:35 crc kubenswrapper[4832]: I1003 00:31:35.617494 4832 state_mem.go:107] "Deleted CPUSet assignment" podUID="16acf770-b28d-402f-8494-bad005f32abb" containerName="extract-content" Oct 03 00:31:35 crc kubenswrapper[4832]: E1003 00:31:35.617506 4832 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16acf770-b28d-402f-8494-bad005f32abb" containerName="extract-utilities" Oct 03 00:31:35 crc kubenswrapper[4832]: I1003 00:31:35.617514 4832 state_mem.go:107] "Deleted CPUSet assignment" podUID="16acf770-b28d-402f-8494-bad005f32abb" containerName="extract-utilities" Oct 03 00:31:35 crc kubenswrapper[4832]: E1003 00:31:35.617527 4832 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="647b857f-9035-4675-89e1-29a2f26cd19d" containerName="git-clone" Oct 03 00:31:35 crc kubenswrapper[4832]: I1003 00:31:35.617536 4832 state_mem.go:107] "Deleted CPUSet assignment" podUID="647b857f-9035-4675-89e1-29a2f26cd19d" containerName="git-clone" Oct 03 00:31:35 crc kubenswrapper[4832]: E1003 00:31:35.617553 4832 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="647b857f-9035-4675-89e1-29a2f26cd19d" containerName="docker-build" Oct 03 00:31:35 crc kubenswrapper[4832]: I1003 00:31:35.617562 4832 state_mem.go:107] "Deleted CPUSet assignment" podUID="647b857f-9035-4675-89e1-29a2f26cd19d" containerName="docker-build" Oct 03 00:31:35 crc kubenswrapper[4832]: I1003 00:31:35.617711 4832 memory_manager.go:354] "RemoveStaleState removing state" podUID="16acf770-b28d-402f-8494-bad005f32abb" containerName="registry-server" Oct 03 00:31:35 crc kubenswrapper[4832]: I1003 00:31:35.617727 4832 memory_manager.go:354] "RemoveStaleState removing state" podUID="647b857f-9035-4675-89e1-29a2f26cd19d" containerName="docker-build" Oct 03 00:31:35 crc kubenswrapper[4832]: I1003 00:31:35.618476 4832 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-bridge-1-build" Oct 03 00:31:35 crc kubenswrapper[4832]: I1003 00:31:35.620663 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-bridge-1-sys-config" Oct 03 00:31:35 crc kubenswrapper[4832]: I1003 00:31:35.620779 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-bridge-1-ca" Oct 03 00:31:35 crc kubenswrapper[4832]: I1003 00:31:35.621177 4832 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-bl95s" Oct 03 00:31:35 crc kubenswrapper[4832]: I1003 00:31:35.621801 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-bridge-1-global-ca" Oct 03 00:31:35 crc kubenswrapper[4832]: I1003 00:31:35.635951 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-bridge-1-build"] Oct 03 00:31:35 crc kubenswrapper[4832]: I1003 00:31:35.781102 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8bd4bef0-e198-45ce-a4a3-143a7765561a-build-ca-bundles\") pod \"sg-bridge-1-build\" (UID: \"8bd4bef0-e198-45ce-a4a3-143a7765561a\") " pod="service-telemetry/sg-bridge-1-build" Oct 03 00:31:35 crc kubenswrapper[4832]: I1003 00:31:35.781155 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8bd4bef0-e198-45ce-a4a3-143a7765561a-build-proxy-ca-bundles\") pod \"sg-bridge-1-build\" (UID: \"8bd4bef0-e198-45ce-a4a3-143a7765561a\") " pod="service-telemetry/sg-bridge-1-build" Oct 03 00:31:35 crc kubenswrapper[4832]: I1003 00:31:35.781188 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/8bd4bef0-e198-45ce-a4a3-143a7765561a-node-pullsecrets\") pod \"sg-bridge-1-build\" (UID: \"8bd4bef0-e198-45ce-a4a3-143a7765561a\") " pod="service-telemetry/sg-bridge-1-build" Oct 03 00:31:35 crc kubenswrapper[4832]: I1003 00:31:35.781215 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/8bd4bef0-e198-45ce-a4a3-143a7765561a-container-storage-root\") pod \"sg-bridge-1-build\" (UID: \"8bd4bef0-e198-45ce-a4a3-143a7765561a\") " pod="service-telemetry/sg-bridge-1-build" Oct 03 00:31:35 crc kubenswrapper[4832]: I1003 00:31:35.781234 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-bl95s-pull\" (UniqueName: \"kubernetes.io/secret/8bd4bef0-e198-45ce-a4a3-143a7765561a-builder-dockercfg-bl95s-pull\") pod \"sg-bridge-1-build\" (UID: \"8bd4bef0-e198-45ce-a4a3-143a7765561a\") " pod="service-telemetry/sg-bridge-1-build" Oct 03 00:31:35 crc kubenswrapper[4832]: I1003 00:31:35.781256 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-bl95s-push\" (UniqueName: \"kubernetes.io/secret/8bd4bef0-e198-45ce-a4a3-143a7765561a-builder-dockercfg-bl95s-push\") pod \"sg-bridge-1-build\" (UID: \"8bd4bef0-e198-45ce-a4a3-143a7765561a\") " pod="service-telemetry/sg-bridge-1-build" Oct 03 00:31:35 crc kubenswrapper[4832]: I1003 00:31:35.781278 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/8bd4bef0-e198-45ce-a4a3-143a7765561a-buildcachedir\") pod \"sg-bridge-1-build\" (UID: \"8bd4bef0-e198-45ce-a4a3-143a7765561a\") " pod="service-telemetry/sg-bridge-1-build" Oct 03 00:31:35 crc kubenswrapper[4832]: I1003 00:31:35.781299 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/8bd4bef0-e198-45ce-a4a3-143a7765561a-build-blob-cache\") pod \"sg-bridge-1-build\" (UID: \"8bd4bef0-e198-45ce-a4a3-143a7765561a\") " pod="service-telemetry/sg-bridge-1-build" Oct 03 00:31:35 crc kubenswrapper[4832]: I1003 00:31:35.781319 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/8bd4bef0-e198-45ce-a4a3-143a7765561a-build-system-configs\") pod \"sg-bridge-1-build\" (UID: \"8bd4bef0-e198-45ce-a4a3-143a7765561a\") " pod="service-telemetry/sg-bridge-1-build" Oct 03 00:31:35 crc kubenswrapper[4832]: I1003 00:31:35.781339 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/8bd4bef0-e198-45ce-a4a3-143a7765561a-container-storage-run\") pod \"sg-bridge-1-build\" (UID: \"8bd4bef0-e198-45ce-a4a3-143a7765561a\") " pod="service-telemetry/sg-bridge-1-build" Oct 03 00:31:35 crc kubenswrapper[4832]: I1003 00:31:35.781356 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/8bd4bef0-e198-45ce-a4a3-143a7765561a-buildworkdir\") pod \"sg-bridge-1-build\" (UID: \"8bd4bef0-e198-45ce-a4a3-143a7765561a\") " pod="service-telemetry/sg-bridge-1-build" Oct 03 00:31:35 crc kubenswrapper[4832]: I1003 00:31:35.781381 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wnvcl\" (UniqueName: \"kubernetes.io/projected/8bd4bef0-e198-45ce-a4a3-143a7765561a-kube-api-access-wnvcl\") pod \"sg-bridge-1-build\" (UID: \"8bd4bef0-e198-45ce-a4a3-143a7765561a\") " pod="service-telemetry/sg-bridge-1-build" Oct 03 00:31:35 crc kubenswrapper[4832]: I1003 00:31:35.882858 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/8bd4bef0-e198-45ce-a4a3-143a7765561a-buildworkdir\") pod \"sg-bridge-1-build\" (UID: \"8bd4bef0-e198-45ce-a4a3-143a7765561a\") " pod="service-telemetry/sg-bridge-1-build" Oct 03 00:31:35 crc kubenswrapper[4832]: I1003 00:31:35.882951 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wnvcl\" (UniqueName: \"kubernetes.io/projected/8bd4bef0-e198-45ce-a4a3-143a7765561a-kube-api-access-wnvcl\") pod \"sg-bridge-1-build\" (UID: \"8bd4bef0-e198-45ce-a4a3-143a7765561a\") " pod="service-telemetry/sg-bridge-1-build" Oct 03 00:31:35 crc kubenswrapper[4832]: I1003 00:31:35.882988 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8bd4bef0-e198-45ce-a4a3-143a7765561a-build-ca-bundles\") pod \"sg-bridge-1-build\" (UID: \"8bd4bef0-e198-45ce-a4a3-143a7765561a\") " pod="service-telemetry/sg-bridge-1-build" Oct 03 00:31:35 crc kubenswrapper[4832]: I1003 00:31:35.883023 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" 
(UniqueName: \"kubernetes.io/configmap/8bd4bef0-e198-45ce-a4a3-143a7765561a-build-proxy-ca-bundles\") pod \"sg-bridge-1-build\" (UID: \"8bd4bef0-e198-45ce-a4a3-143a7765561a\") " pod="service-telemetry/sg-bridge-1-build" Oct 03 00:31:35 crc kubenswrapper[4832]: I1003 00:31:35.883097 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/8bd4bef0-e198-45ce-a4a3-143a7765561a-node-pullsecrets\") pod \"sg-bridge-1-build\" (UID: \"8bd4bef0-e198-45ce-a4a3-143a7765561a\") " pod="service-telemetry/sg-bridge-1-build" Oct 03 00:31:35 crc kubenswrapper[4832]: I1003 00:31:35.883129 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/8bd4bef0-e198-45ce-a4a3-143a7765561a-container-storage-root\") pod \"sg-bridge-1-build\" (UID: \"8bd4bef0-e198-45ce-a4a3-143a7765561a\") " pod="service-telemetry/sg-bridge-1-build" Oct 03 00:31:35 crc kubenswrapper[4832]: I1003 00:31:35.883154 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-bl95s-pull\" (UniqueName: \"kubernetes.io/secret/8bd4bef0-e198-45ce-a4a3-143a7765561a-builder-dockercfg-bl95s-pull\") pod \"sg-bridge-1-build\" (UID: \"8bd4bef0-e198-45ce-a4a3-143a7765561a\") " pod="service-telemetry/sg-bridge-1-build" Oct 03 00:31:35 crc kubenswrapper[4832]: I1003 00:31:35.883186 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-bl95s-push\" (UniqueName: \"kubernetes.io/secret/8bd4bef0-e198-45ce-a4a3-143a7765561a-builder-dockercfg-bl95s-push\") pod \"sg-bridge-1-build\" (UID: \"8bd4bef0-e198-45ce-a4a3-143a7765561a\") " pod="service-telemetry/sg-bridge-1-build" Oct 03 00:31:35 crc kubenswrapper[4832]: I1003 00:31:35.883220 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/8bd4bef0-e198-45ce-a4a3-143a7765561a-buildcachedir\") pod \"sg-bridge-1-build\" (UID: \"8bd4bef0-e198-45ce-a4a3-143a7765561a\") " pod="service-telemetry/sg-bridge-1-build" Oct 03 00:31:35 crc kubenswrapper[4832]: I1003 00:31:35.883255 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/8bd4bef0-e198-45ce-a4a3-143a7765561a-build-blob-cache\") pod \"sg-bridge-1-build\" (UID: \"8bd4bef0-e198-45ce-a4a3-143a7765561a\") " pod="service-telemetry/sg-bridge-1-build" Oct 03 00:31:35 crc kubenswrapper[4832]: I1003 00:31:35.883286 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/8bd4bef0-e198-45ce-a4a3-143a7765561a-build-system-configs\") pod \"sg-bridge-1-build\" (UID: \"8bd4bef0-e198-45ce-a4a3-143a7765561a\") " pod="service-telemetry/sg-bridge-1-build" Oct 03 00:31:35 crc kubenswrapper[4832]: I1003 00:31:35.883314 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/8bd4bef0-e198-45ce-a4a3-143a7765561a-container-storage-run\") pod \"sg-bridge-1-build\" (UID: \"8bd4bef0-e198-45ce-a4a3-143a7765561a\") " pod="service-telemetry/sg-bridge-1-build" Oct 03 00:31:35 crc kubenswrapper[4832]: I1003 00:31:35.883409 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: 
\"kubernetes.io/empty-dir/8bd4bef0-e198-45ce-a4a3-143a7765561a-buildworkdir\") pod \"sg-bridge-1-build\" (UID: \"8bd4bef0-e198-45ce-a4a3-143a7765561a\") " pod="service-telemetry/sg-bridge-1-build" Oct 03 00:31:35 crc kubenswrapper[4832]: I1003 00:31:35.883640 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/8bd4bef0-e198-45ce-a4a3-143a7765561a-container-storage-root\") pod \"sg-bridge-1-build\" (UID: \"8bd4bef0-e198-45ce-a4a3-143a7765561a\") " pod="service-telemetry/sg-bridge-1-build" Oct 03 00:31:35 crc kubenswrapper[4832]: I1003 00:31:35.883914 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/8bd4bef0-e198-45ce-a4a3-143a7765561a-container-storage-run\") pod \"sg-bridge-1-build\" (UID: \"8bd4bef0-e198-45ce-a4a3-143a7765561a\") " pod="service-telemetry/sg-bridge-1-build" Oct 03 00:31:35 crc kubenswrapper[4832]: I1003 00:31:35.884038 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/8bd4bef0-e198-45ce-a4a3-143a7765561a-node-pullsecrets\") pod \"sg-bridge-1-build\" (UID: \"8bd4bef0-e198-45ce-a4a3-143a7765561a\") " pod="service-telemetry/sg-bridge-1-build" Oct 03 00:31:35 crc kubenswrapper[4832]: I1003 00:31:35.884692 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/8bd4bef0-e198-45ce-a4a3-143a7765561a-buildcachedir\") pod \"sg-bridge-1-build\" (UID: \"8bd4bef0-e198-45ce-a4a3-143a7765561a\") " pod="service-telemetry/sg-bridge-1-build" Oct 03 00:31:35 crc kubenswrapper[4832]: I1003 00:31:35.884855 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8bd4bef0-e198-45ce-a4a3-143a7765561a-build-proxy-ca-bundles\") pod \"sg-bridge-1-build\" (UID: \"8bd4bef0-e198-45ce-a4a3-143a7765561a\") " pod="service-telemetry/sg-bridge-1-build" Oct 03 00:31:35 crc kubenswrapper[4832]: I1003 00:31:35.884860 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8bd4bef0-e198-45ce-a4a3-143a7765561a-build-ca-bundles\") pod \"sg-bridge-1-build\" (UID: \"8bd4bef0-e198-45ce-a4a3-143a7765561a\") " pod="service-telemetry/sg-bridge-1-build" Oct 03 00:31:35 crc kubenswrapper[4832]: I1003 00:31:35.884901 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/8bd4bef0-e198-45ce-a4a3-143a7765561a-build-blob-cache\") pod \"sg-bridge-1-build\" (UID: \"8bd4bef0-e198-45ce-a4a3-143a7765561a\") " pod="service-telemetry/sg-bridge-1-build" Oct 03 00:31:35 crc kubenswrapper[4832]: I1003 00:31:35.885255 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/8bd4bef0-e198-45ce-a4a3-143a7765561a-build-system-configs\") pod \"sg-bridge-1-build\" (UID: \"8bd4bef0-e198-45ce-a4a3-143a7765561a\") " pod="service-telemetry/sg-bridge-1-build" Oct 03 00:31:35 crc kubenswrapper[4832]: I1003 00:31:35.891356 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-bl95s-pull\" (UniqueName: \"kubernetes.io/secret/8bd4bef0-e198-45ce-a4a3-143a7765561a-builder-dockercfg-bl95s-pull\") pod \"sg-bridge-1-build\" (UID: \"8bd4bef0-e198-45ce-a4a3-143a7765561a\") " 
pod="service-telemetry/sg-bridge-1-build" Oct 03 00:31:35 crc kubenswrapper[4832]: I1003 00:31:35.896876 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-bl95s-push\" (UniqueName: \"kubernetes.io/secret/8bd4bef0-e198-45ce-a4a3-143a7765561a-builder-dockercfg-bl95s-push\") pod \"sg-bridge-1-build\" (UID: \"8bd4bef0-e198-45ce-a4a3-143a7765561a\") " pod="service-telemetry/sg-bridge-1-build" Oct 03 00:31:35 crc kubenswrapper[4832]: I1003 00:31:35.901296 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wnvcl\" (UniqueName: \"kubernetes.io/projected/8bd4bef0-e198-45ce-a4a3-143a7765561a-kube-api-access-wnvcl\") pod \"sg-bridge-1-build\" (UID: \"8bd4bef0-e198-45ce-a4a3-143a7765561a\") " pod="service-telemetry/sg-bridge-1-build" Oct 03 00:31:35 crc kubenswrapper[4832]: I1003 00:31:35.934391 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-bridge-1-build" Oct 03 00:31:36 crc kubenswrapper[4832]: I1003 00:31:36.174648 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-bridge-1-build"] Oct 03 00:31:36 crc kubenswrapper[4832]: I1003 00:31:36.394246 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-1-build" event={"ID":"8bd4bef0-e198-45ce-a4a3-143a7765561a","Type":"ContainerStarted","Data":"76dfc346927fba9b6b31ba6f02686cccf9844d0f6bf9ede025118a1e773f9b94"} Oct 03 00:31:37 crc kubenswrapper[4832]: I1003 00:31:37.401890 4832 generic.go:334] "Generic (PLEG): container finished" podID="8bd4bef0-e198-45ce-a4a3-143a7765561a" containerID="cce5a13c3b0bdffe025c7ccda6f7b04bbae7a6c92f96992063e2f0b6eeeda214" exitCode=0 Oct 03 00:31:37 crc kubenswrapper[4832]: I1003 00:31:37.401982 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-1-build" event={"ID":"8bd4bef0-e198-45ce-a4a3-143a7765561a","Type":"ContainerDied","Data":"cce5a13c3b0bdffe025c7ccda6f7b04bbae7a6c92f96992063e2f0b6eeeda214"} Oct 03 00:31:38 crc kubenswrapper[4832]: I1003 00:31:38.410327 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-1-build" event={"ID":"8bd4bef0-e198-45ce-a4a3-143a7765561a","Type":"ContainerStarted","Data":"0b9480d4ca64492ad1eb375d771f8da130972cd257da020961461d420ead10f2"} Oct 03 00:31:38 crc kubenswrapper[4832]: I1003 00:31:38.432218 4832 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/sg-bridge-1-build" podStartSLOduration=3.432198625 podStartE2EDuration="3.432198625s" podCreationTimestamp="2025-10-03 00:31:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 00:31:38.430577023 +0000 UTC m=+1491.692732129" watchObservedRunningTime="2025-10-03 00:31:38.432198625 +0000 UTC m=+1491.694353731" Oct 03 00:31:42 crc kubenswrapper[4832]: I1003 00:31:42.385955 4832 scope.go:117] "RemoveContainer" containerID="e44b7de9ac0dc4f733d645c5d56e63447d6c890efc8c243af73d5c7937b609ee" Oct 03 00:31:42 crc kubenswrapper[4832]: E1003 00:31:42.386405 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xzpk4_openshift-machine-config-operator(13eeb165-19f7-4e3f-a442-1c9c570d853b)\"" pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" 
podUID="13eeb165-19f7-4e3f-a442-1c9c570d853b" Oct 03 00:31:44 crc kubenswrapper[4832]: I1003 00:31:44.446807 4832 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_sg-bridge-1-build_8bd4bef0-e198-45ce-a4a3-143a7765561a/docker-build/0.log" Oct 03 00:31:44 crc kubenswrapper[4832]: I1003 00:31:44.447523 4832 generic.go:334] "Generic (PLEG): container finished" podID="8bd4bef0-e198-45ce-a4a3-143a7765561a" containerID="0b9480d4ca64492ad1eb375d771f8da130972cd257da020961461d420ead10f2" exitCode=1 Oct 03 00:31:44 crc kubenswrapper[4832]: I1003 00:31:44.447564 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-1-build" event={"ID":"8bd4bef0-e198-45ce-a4a3-143a7765561a","Type":"ContainerDied","Data":"0b9480d4ca64492ad1eb375d771f8da130972cd257da020961461d420ead10f2"} Oct 03 00:31:45 crc kubenswrapper[4832]: I1003 00:31:45.673642 4832 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_sg-bridge-1-build_8bd4bef0-e198-45ce-a4a3-143a7765561a/docker-build/0.log" Oct 03 00:31:45 crc kubenswrapper[4832]: I1003 00:31:45.674277 4832 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-bridge-1-build" Oct 03 00:31:45 crc kubenswrapper[4832]: I1003 00:31:45.810853 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-bl95s-push\" (UniqueName: \"kubernetes.io/secret/8bd4bef0-e198-45ce-a4a3-143a7765561a-builder-dockercfg-bl95s-push\") pod \"8bd4bef0-e198-45ce-a4a3-143a7765561a\" (UID: \"8bd4bef0-e198-45ce-a4a3-143a7765561a\") " Oct 03 00:31:45 crc kubenswrapper[4832]: I1003 00:31:45.810899 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/8bd4bef0-e198-45ce-a4a3-143a7765561a-build-system-configs\") pod \"8bd4bef0-e198-45ce-a4a3-143a7765561a\" (UID: \"8bd4bef0-e198-45ce-a4a3-143a7765561a\") " Oct 03 00:31:45 crc kubenswrapper[4832]: I1003 00:31:45.810994 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8bd4bef0-e198-45ce-a4a3-143a7765561a-build-proxy-ca-bundles\") pod \"8bd4bef0-e198-45ce-a4a3-143a7765561a\" (UID: \"8bd4bef0-e198-45ce-a4a3-143a7765561a\") " Oct 03 00:31:45 crc kubenswrapper[4832]: I1003 00:31:45.811030 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-bl95s-pull\" (UniqueName: \"kubernetes.io/secret/8bd4bef0-e198-45ce-a4a3-143a7765561a-builder-dockercfg-bl95s-pull\") pod \"8bd4bef0-e198-45ce-a4a3-143a7765561a\" (UID: \"8bd4bef0-e198-45ce-a4a3-143a7765561a\") " Oct 03 00:31:45 crc kubenswrapper[4832]: I1003 00:31:45.811061 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/8bd4bef0-e198-45ce-a4a3-143a7765561a-buildcachedir\") pod \"8bd4bef0-e198-45ce-a4a3-143a7765561a\" (UID: \"8bd4bef0-e198-45ce-a4a3-143a7765561a\") " Oct 03 00:31:45 crc kubenswrapper[4832]: I1003 00:31:45.811081 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/8bd4bef0-e198-45ce-a4a3-143a7765561a-container-storage-run\") pod \"8bd4bef0-e198-45ce-a4a3-143a7765561a\" (UID: \"8bd4bef0-e198-45ce-a4a3-143a7765561a\") " Oct 03 00:31:45 crc kubenswrapper[4832]: I1003 00:31:45.811102 4832 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/8bd4bef0-e198-45ce-a4a3-143a7765561a-buildworkdir\") pod \"8bd4bef0-e198-45ce-a4a3-143a7765561a\" (UID: \"8bd4bef0-e198-45ce-a4a3-143a7765561a\") " Oct 03 00:31:45 crc kubenswrapper[4832]: I1003 00:31:45.811127 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/8bd4bef0-e198-45ce-a4a3-143a7765561a-build-blob-cache\") pod \"8bd4bef0-e198-45ce-a4a3-143a7765561a\" (UID: \"8bd4bef0-e198-45ce-a4a3-143a7765561a\") " Oct 03 00:31:45 crc kubenswrapper[4832]: I1003 00:31:45.811156 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/8bd4bef0-e198-45ce-a4a3-143a7765561a-container-storage-root\") pod \"8bd4bef0-e198-45ce-a4a3-143a7765561a\" (UID: \"8bd4bef0-e198-45ce-a4a3-143a7765561a\") " Oct 03 00:31:45 crc kubenswrapper[4832]: I1003 00:31:45.811174 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wnvcl\" (UniqueName: \"kubernetes.io/projected/8bd4bef0-e198-45ce-a4a3-143a7765561a-kube-api-access-wnvcl\") pod \"8bd4bef0-e198-45ce-a4a3-143a7765561a\" (UID: \"8bd4bef0-e198-45ce-a4a3-143a7765561a\") " Oct 03 00:31:45 crc kubenswrapper[4832]: I1003 00:31:45.811202 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8bd4bef0-e198-45ce-a4a3-143a7765561a-build-ca-bundles\") pod \"8bd4bef0-e198-45ce-a4a3-143a7765561a\" (UID: \"8bd4bef0-e198-45ce-a4a3-143a7765561a\") " Oct 03 00:31:45 crc kubenswrapper[4832]: I1003 00:31:45.811223 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/8bd4bef0-e198-45ce-a4a3-143a7765561a-node-pullsecrets\") pod \"8bd4bef0-e198-45ce-a4a3-143a7765561a\" (UID: \"8bd4bef0-e198-45ce-a4a3-143a7765561a\") " Oct 03 00:31:45 crc kubenswrapper[4832]: I1003 00:31:45.811508 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8bd4bef0-e198-45ce-a4a3-143a7765561a-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "8bd4bef0-e198-45ce-a4a3-143a7765561a" (UID: "8bd4bef0-e198-45ce-a4a3-143a7765561a"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 00:31:45 crc kubenswrapper[4832]: I1003 00:31:45.811536 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8bd4bef0-e198-45ce-a4a3-143a7765561a-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "8bd4bef0-e198-45ce-a4a3-143a7765561a" (UID: "8bd4bef0-e198-45ce-a4a3-143a7765561a"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 00:31:45 crc kubenswrapper[4832]: I1003 00:31:45.811549 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8bd4bef0-e198-45ce-a4a3-143a7765561a-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "8bd4bef0-e198-45ce-a4a3-143a7765561a" (UID: "8bd4bef0-e198-45ce-a4a3-143a7765561a"). InnerVolumeSpecName "build-system-configs". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 00:31:45 crc kubenswrapper[4832]: I1003 00:31:45.811663 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8bd4bef0-e198-45ce-a4a3-143a7765561a-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "8bd4bef0-e198-45ce-a4a3-143a7765561a" (UID: "8bd4bef0-e198-45ce-a4a3-143a7765561a"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 00:31:45 crc kubenswrapper[4832]: I1003 00:31:45.812392 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8bd4bef0-e198-45ce-a4a3-143a7765561a-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "8bd4bef0-e198-45ce-a4a3-143a7765561a" (UID: "8bd4bef0-e198-45ce-a4a3-143a7765561a"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 00:31:45 crc kubenswrapper[4832]: I1003 00:31:45.812449 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8bd4bef0-e198-45ce-a4a3-143a7765561a-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "8bd4bef0-e198-45ce-a4a3-143a7765561a" (UID: "8bd4bef0-e198-45ce-a4a3-143a7765561a"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 00:31:45 crc kubenswrapper[4832]: I1003 00:31:45.812629 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8bd4bef0-e198-45ce-a4a3-143a7765561a-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "8bd4bef0-e198-45ce-a4a3-143a7765561a" (UID: "8bd4bef0-e198-45ce-a4a3-143a7765561a"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 00:31:45 crc kubenswrapper[4832]: I1003 00:31:45.816571 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8bd4bef0-e198-45ce-a4a3-143a7765561a-builder-dockercfg-bl95s-pull" (OuterVolumeSpecName: "builder-dockercfg-bl95s-pull") pod "8bd4bef0-e198-45ce-a4a3-143a7765561a" (UID: "8bd4bef0-e198-45ce-a4a3-143a7765561a"). InnerVolumeSpecName "builder-dockercfg-bl95s-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 00:31:45 crc kubenswrapper[4832]: I1003 00:31:45.816748 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8bd4bef0-e198-45ce-a4a3-143a7765561a-kube-api-access-wnvcl" (OuterVolumeSpecName: "kube-api-access-wnvcl") pod "8bd4bef0-e198-45ce-a4a3-143a7765561a" (UID: "8bd4bef0-e198-45ce-a4a3-143a7765561a"). InnerVolumeSpecName "kube-api-access-wnvcl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 00:31:45 crc kubenswrapper[4832]: I1003 00:31:45.816821 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8bd4bef0-e198-45ce-a4a3-143a7765561a-builder-dockercfg-bl95s-push" (OuterVolumeSpecName: "builder-dockercfg-bl95s-push") pod "8bd4bef0-e198-45ce-a4a3-143a7765561a" (UID: "8bd4bef0-e198-45ce-a4a3-143a7765561a"). InnerVolumeSpecName "builder-dockercfg-bl95s-push". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 00:31:45 crc kubenswrapper[4832]: I1003 00:31:45.881057 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8bd4bef0-e198-45ce-a4a3-143a7765561a-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "8bd4bef0-e198-45ce-a4a3-143a7765561a" (UID: "8bd4bef0-e198-45ce-a4a3-143a7765561a"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 00:31:45 crc kubenswrapper[4832]: I1003 00:31:45.912822 4832 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/8bd4bef0-e198-45ce-a4a3-143a7765561a-buildworkdir\") on node \"crc\" DevicePath \"\"" Oct 03 00:31:45 crc kubenswrapper[4832]: I1003 00:31:45.912858 4832 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/8bd4bef0-e198-45ce-a4a3-143a7765561a-build-blob-cache\") on node \"crc\" DevicePath \"\"" Oct 03 00:31:45 crc kubenswrapper[4832]: I1003 00:31:45.912867 4832 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wnvcl\" (UniqueName: \"kubernetes.io/projected/8bd4bef0-e198-45ce-a4a3-143a7765561a-kube-api-access-wnvcl\") on node \"crc\" DevicePath \"\"" Oct 03 00:31:45 crc kubenswrapper[4832]: I1003 00:31:45.912877 4832 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8bd4bef0-e198-45ce-a4a3-143a7765561a-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 03 00:31:45 crc kubenswrapper[4832]: I1003 00:31:45.912885 4832 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/8bd4bef0-e198-45ce-a4a3-143a7765561a-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Oct 03 00:31:45 crc kubenswrapper[4832]: I1003 00:31:45.912893 4832 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-bl95s-push\" (UniqueName: \"kubernetes.io/secret/8bd4bef0-e198-45ce-a4a3-143a7765561a-builder-dockercfg-bl95s-push\") on node \"crc\" DevicePath \"\"" Oct 03 00:31:45 crc kubenswrapper[4832]: I1003 00:31:45.912903 4832 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/8bd4bef0-e198-45ce-a4a3-143a7765561a-build-system-configs\") on node \"crc\" DevicePath \"\"" Oct 03 00:31:45 crc kubenswrapper[4832]: I1003 00:31:45.912911 4832 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8bd4bef0-e198-45ce-a4a3-143a7765561a-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 03 00:31:45 crc kubenswrapper[4832]: I1003 00:31:45.912919 4832 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-bl95s-pull\" (UniqueName: \"kubernetes.io/secret/8bd4bef0-e198-45ce-a4a3-143a7765561a-builder-dockercfg-bl95s-pull\") on node \"crc\" DevicePath \"\"" Oct 03 00:31:45 crc kubenswrapper[4832]: I1003 00:31:45.912928 4832 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/8bd4bef0-e198-45ce-a4a3-143a7765561a-buildcachedir\") on node \"crc\" DevicePath \"\"" Oct 03 00:31:45 crc kubenswrapper[4832]: I1003 00:31:45.912937 4832 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/8bd4bef0-e198-45ce-a4a3-143a7765561a-container-storage-run\") on node \"crc\" DevicePath 
\"\"" Oct 03 00:31:45 crc kubenswrapper[4832]: I1003 00:31:45.929879 4832 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/sg-bridge-1-build"] Oct 03 00:31:45 crc kubenswrapper[4832]: I1003 00:31:45.933344 4832 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/sg-bridge-1-build"] Oct 03 00:31:46 crc kubenswrapper[4832]: I1003 00:31:46.146125 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8bd4bef0-e198-45ce-a4a3-143a7765561a-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "8bd4bef0-e198-45ce-a4a3-143a7765561a" (UID: "8bd4bef0-e198-45ce-a4a3-143a7765561a"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 00:31:46 crc kubenswrapper[4832]: I1003 00:31:46.216190 4832 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/8bd4bef0-e198-45ce-a4a3-143a7765561a-container-storage-root\") on node \"crc\" DevicePath \"\"" Oct 03 00:31:46 crc kubenswrapper[4832]: I1003 00:31:46.462129 4832 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_sg-bridge-1-build_8bd4bef0-e198-45ce-a4a3-143a7765561a/docker-build/0.log" Oct 03 00:31:46 crc kubenswrapper[4832]: I1003 00:31:46.462727 4832 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="76dfc346927fba9b6b31ba6f02686cccf9844d0f6bf9ede025118a1e773f9b94" Oct 03 00:31:46 crc kubenswrapper[4832]: I1003 00:31:46.462798 4832 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-bridge-1-build" Oct 03 00:31:47 crc kubenswrapper[4832]: I1003 00:31:47.395311 4832 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8bd4bef0-e198-45ce-a4a3-143a7765561a" path="/var/lib/kubelet/pods/8bd4bef0-e198-45ce-a4a3-143a7765561a/volumes" Oct 03 00:31:47 crc kubenswrapper[4832]: I1003 00:31:47.529711 4832 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/sg-bridge-2-build"] Oct 03 00:31:47 crc kubenswrapper[4832]: E1003 00:31:47.530654 4832 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8bd4bef0-e198-45ce-a4a3-143a7765561a" containerName="docker-build" Oct 03 00:31:47 crc kubenswrapper[4832]: I1003 00:31:47.530675 4832 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bd4bef0-e198-45ce-a4a3-143a7765561a" containerName="docker-build" Oct 03 00:31:47 crc kubenswrapper[4832]: E1003 00:31:47.530704 4832 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8bd4bef0-e198-45ce-a4a3-143a7765561a" containerName="manage-dockerfile" Oct 03 00:31:47 crc kubenswrapper[4832]: I1003 00:31:47.530715 4832 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bd4bef0-e198-45ce-a4a3-143a7765561a" containerName="manage-dockerfile" Oct 03 00:31:47 crc kubenswrapper[4832]: I1003 00:31:47.530902 4832 memory_manager.go:354] "RemoveStaleState removing state" podUID="8bd4bef0-e198-45ce-a4a3-143a7765561a" containerName="docker-build" Oct 03 00:31:47 crc kubenswrapper[4832]: I1003 00:31:47.532089 4832 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-bridge-2-build" Oct 03 00:31:47 crc kubenswrapper[4832]: I1003 00:31:47.535958 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-bridge-2-global-ca" Oct 03 00:31:47 crc kubenswrapper[4832]: I1003 00:31:47.536165 4832 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-bl95s" Oct 03 00:31:47 crc kubenswrapper[4832]: I1003 00:31:47.536328 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-bridge-2-sys-config" Oct 03 00:31:47 crc kubenswrapper[4832]: I1003 00:31:47.536331 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-bridge-2-ca" Oct 03 00:31:47 crc kubenswrapper[4832]: I1003 00:31:47.546976 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-bridge-2-build"] Oct 03 00:31:47 crc kubenswrapper[4832]: I1003 00:31:47.634737 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/93f73a7a-92cb-47a8-bb25-813154fbcb1f-build-system-configs\") pod \"sg-bridge-2-build\" (UID: \"93f73a7a-92cb-47a8-bb25-813154fbcb1f\") " pod="service-telemetry/sg-bridge-2-build" Oct 03 00:31:47 crc kubenswrapper[4832]: I1003 00:31:47.634807 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/93f73a7a-92cb-47a8-bb25-813154fbcb1f-buildworkdir\") pod \"sg-bridge-2-build\" (UID: \"93f73a7a-92cb-47a8-bb25-813154fbcb1f\") " pod="service-telemetry/sg-bridge-2-build" Oct 03 00:31:47 crc kubenswrapper[4832]: I1003 00:31:47.634842 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/93f73a7a-92cb-47a8-bb25-813154fbcb1f-build-proxy-ca-bundles\") pod \"sg-bridge-2-build\" (UID: \"93f73a7a-92cb-47a8-bb25-813154fbcb1f\") " pod="service-telemetry/sg-bridge-2-build" Oct 03 00:31:47 crc kubenswrapper[4832]: I1003 00:31:47.635036 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wp7dz\" (UniqueName: \"kubernetes.io/projected/93f73a7a-92cb-47a8-bb25-813154fbcb1f-kube-api-access-wp7dz\") pod \"sg-bridge-2-build\" (UID: \"93f73a7a-92cb-47a8-bb25-813154fbcb1f\") " pod="service-telemetry/sg-bridge-2-build" Oct 03 00:31:47 crc kubenswrapper[4832]: I1003 00:31:47.635198 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/93f73a7a-92cb-47a8-bb25-813154fbcb1f-build-blob-cache\") pod \"sg-bridge-2-build\" (UID: \"93f73a7a-92cb-47a8-bb25-813154fbcb1f\") " pod="service-telemetry/sg-bridge-2-build" Oct 03 00:31:47 crc kubenswrapper[4832]: I1003 00:31:47.635313 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/93f73a7a-92cb-47a8-bb25-813154fbcb1f-container-storage-run\") pod \"sg-bridge-2-build\" (UID: \"93f73a7a-92cb-47a8-bb25-813154fbcb1f\") " pod="service-telemetry/sg-bridge-2-build" Oct 03 00:31:47 crc kubenswrapper[4832]: I1003 00:31:47.635468 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: 
\"kubernetes.io/host-path/93f73a7a-92cb-47a8-bb25-813154fbcb1f-node-pullsecrets\") pod \"sg-bridge-2-build\" (UID: \"93f73a7a-92cb-47a8-bb25-813154fbcb1f\") " pod="service-telemetry/sg-bridge-2-build" Oct 03 00:31:47 crc kubenswrapper[4832]: I1003 00:31:47.635634 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-bl95s-push\" (UniqueName: \"kubernetes.io/secret/93f73a7a-92cb-47a8-bb25-813154fbcb1f-builder-dockercfg-bl95s-push\") pod \"sg-bridge-2-build\" (UID: \"93f73a7a-92cb-47a8-bb25-813154fbcb1f\") " pod="service-telemetry/sg-bridge-2-build" Oct 03 00:31:47 crc kubenswrapper[4832]: I1003 00:31:47.635709 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-bl95s-pull\" (UniqueName: \"kubernetes.io/secret/93f73a7a-92cb-47a8-bb25-813154fbcb1f-builder-dockercfg-bl95s-pull\") pod \"sg-bridge-2-build\" (UID: \"93f73a7a-92cb-47a8-bb25-813154fbcb1f\") " pod="service-telemetry/sg-bridge-2-build" Oct 03 00:31:47 crc kubenswrapper[4832]: I1003 00:31:47.635902 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/93f73a7a-92cb-47a8-bb25-813154fbcb1f-build-ca-bundles\") pod \"sg-bridge-2-build\" (UID: \"93f73a7a-92cb-47a8-bb25-813154fbcb1f\") " pod="service-telemetry/sg-bridge-2-build" Oct 03 00:31:47 crc kubenswrapper[4832]: I1003 00:31:47.636003 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/93f73a7a-92cb-47a8-bb25-813154fbcb1f-buildcachedir\") pod \"sg-bridge-2-build\" (UID: \"93f73a7a-92cb-47a8-bb25-813154fbcb1f\") " pod="service-telemetry/sg-bridge-2-build" Oct 03 00:31:47 crc kubenswrapper[4832]: I1003 00:31:47.636046 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/93f73a7a-92cb-47a8-bb25-813154fbcb1f-container-storage-root\") pod \"sg-bridge-2-build\" (UID: \"93f73a7a-92cb-47a8-bb25-813154fbcb1f\") " pod="service-telemetry/sg-bridge-2-build" Oct 03 00:31:47 crc kubenswrapper[4832]: I1003 00:31:47.737871 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-bl95s-push\" (UniqueName: \"kubernetes.io/secret/93f73a7a-92cb-47a8-bb25-813154fbcb1f-builder-dockercfg-bl95s-push\") pod \"sg-bridge-2-build\" (UID: \"93f73a7a-92cb-47a8-bb25-813154fbcb1f\") " pod="service-telemetry/sg-bridge-2-build" Oct 03 00:31:47 crc kubenswrapper[4832]: I1003 00:31:47.737942 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-bl95s-pull\" (UniqueName: \"kubernetes.io/secret/93f73a7a-92cb-47a8-bb25-813154fbcb1f-builder-dockercfg-bl95s-pull\") pod \"sg-bridge-2-build\" (UID: \"93f73a7a-92cb-47a8-bb25-813154fbcb1f\") " pod="service-telemetry/sg-bridge-2-build" Oct 03 00:31:47 crc kubenswrapper[4832]: I1003 00:31:47.738000 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/93f73a7a-92cb-47a8-bb25-813154fbcb1f-build-ca-bundles\") pod \"sg-bridge-2-build\" (UID: \"93f73a7a-92cb-47a8-bb25-813154fbcb1f\") " pod="service-telemetry/sg-bridge-2-build" Oct 03 00:31:47 crc kubenswrapper[4832]: I1003 00:31:47.738034 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/93f73a7a-92cb-47a8-bb25-813154fbcb1f-buildcachedir\") pod \"sg-bridge-2-build\" (UID: \"93f73a7a-92cb-47a8-bb25-813154fbcb1f\") " pod="service-telemetry/sg-bridge-2-build" Oct 03 00:31:47 crc kubenswrapper[4832]: I1003 00:31:47.738062 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/93f73a7a-92cb-47a8-bb25-813154fbcb1f-container-storage-root\") pod \"sg-bridge-2-build\" (UID: \"93f73a7a-92cb-47a8-bb25-813154fbcb1f\") " pod="service-telemetry/sg-bridge-2-build" Oct 03 00:31:47 crc kubenswrapper[4832]: I1003 00:31:47.738099 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/93f73a7a-92cb-47a8-bb25-813154fbcb1f-build-system-configs\") pod \"sg-bridge-2-build\" (UID: \"93f73a7a-92cb-47a8-bb25-813154fbcb1f\") " pod="service-telemetry/sg-bridge-2-build" Oct 03 00:31:47 crc kubenswrapper[4832]: I1003 00:31:47.738132 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/93f73a7a-92cb-47a8-bb25-813154fbcb1f-buildworkdir\") pod \"sg-bridge-2-build\" (UID: \"93f73a7a-92cb-47a8-bb25-813154fbcb1f\") " pod="service-telemetry/sg-bridge-2-build" Oct 03 00:31:47 crc kubenswrapper[4832]: I1003 00:31:47.738178 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/93f73a7a-92cb-47a8-bb25-813154fbcb1f-build-proxy-ca-bundles\") pod \"sg-bridge-2-build\" (UID: \"93f73a7a-92cb-47a8-bb25-813154fbcb1f\") " pod="service-telemetry/sg-bridge-2-build" Oct 03 00:31:47 crc kubenswrapper[4832]: I1003 00:31:47.738212 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wp7dz\" (UniqueName: \"kubernetes.io/projected/93f73a7a-92cb-47a8-bb25-813154fbcb1f-kube-api-access-wp7dz\") pod \"sg-bridge-2-build\" (UID: \"93f73a7a-92cb-47a8-bb25-813154fbcb1f\") " pod="service-telemetry/sg-bridge-2-build" Oct 03 00:31:47 crc kubenswrapper[4832]: I1003 00:31:47.738222 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/93f73a7a-92cb-47a8-bb25-813154fbcb1f-buildcachedir\") pod \"sg-bridge-2-build\" (UID: \"93f73a7a-92cb-47a8-bb25-813154fbcb1f\") " pod="service-telemetry/sg-bridge-2-build" Oct 03 00:31:47 crc kubenswrapper[4832]: I1003 00:31:47.738748 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/93f73a7a-92cb-47a8-bb25-813154fbcb1f-buildworkdir\") pod \"sg-bridge-2-build\" (UID: \"93f73a7a-92cb-47a8-bb25-813154fbcb1f\") " pod="service-telemetry/sg-bridge-2-build" Oct 03 00:31:47 crc kubenswrapper[4832]: I1003 00:31:47.738787 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/93f73a7a-92cb-47a8-bb25-813154fbcb1f-container-storage-root\") pod \"sg-bridge-2-build\" (UID: \"93f73a7a-92cb-47a8-bb25-813154fbcb1f\") " pod="service-telemetry/sg-bridge-2-build" Oct 03 00:31:47 crc kubenswrapper[4832]: I1003 00:31:47.739212 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/93f73a7a-92cb-47a8-bb25-813154fbcb1f-build-system-configs\") pod 
\"sg-bridge-2-build\" (UID: \"93f73a7a-92cb-47a8-bb25-813154fbcb1f\") " pod="service-telemetry/sg-bridge-2-build" Oct 03 00:31:47 crc kubenswrapper[4832]: I1003 00:31:47.739306 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/93f73a7a-92cb-47a8-bb25-813154fbcb1f-build-proxy-ca-bundles\") pod \"sg-bridge-2-build\" (UID: \"93f73a7a-92cb-47a8-bb25-813154fbcb1f\") " pod="service-telemetry/sg-bridge-2-build" Oct 03 00:31:47 crc kubenswrapper[4832]: I1003 00:31:47.739407 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/93f73a7a-92cb-47a8-bb25-813154fbcb1f-build-blob-cache\") pod \"sg-bridge-2-build\" (UID: \"93f73a7a-92cb-47a8-bb25-813154fbcb1f\") " pod="service-telemetry/sg-bridge-2-build" Oct 03 00:31:47 crc kubenswrapper[4832]: I1003 00:31:47.739724 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/93f73a7a-92cb-47a8-bb25-813154fbcb1f-build-blob-cache\") pod \"sg-bridge-2-build\" (UID: \"93f73a7a-92cb-47a8-bb25-813154fbcb1f\") " pod="service-telemetry/sg-bridge-2-build" Oct 03 00:31:47 crc kubenswrapper[4832]: I1003 00:31:47.739822 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/93f73a7a-92cb-47a8-bb25-813154fbcb1f-container-storage-run\") pod \"sg-bridge-2-build\" (UID: \"93f73a7a-92cb-47a8-bb25-813154fbcb1f\") " pod="service-telemetry/sg-bridge-2-build" Oct 03 00:31:47 crc kubenswrapper[4832]: I1003 00:31:47.739924 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/93f73a7a-92cb-47a8-bb25-813154fbcb1f-build-ca-bundles\") pod \"sg-bridge-2-build\" (UID: \"93f73a7a-92cb-47a8-bb25-813154fbcb1f\") " pod="service-telemetry/sg-bridge-2-build" Oct 03 00:31:47 crc kubenswrapper[4832]: I1003 00:31:47.740190 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/93f73a7a-92cb-47a8-bb25-813154fbcb1f-container-storage-run\") pod \"sg-bridge-2-build\" (UID: \"93f73a7a-92cb-47a8-bb25-813154fbcb1f\") " pod="service-telemetry/sg-bridge-2-build" Oct 03 00:31:47 crc kubenswrapper[4832]: I1003 00:31:47.740264 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/93f73a7a-92cb-47a8-bb25-813154fbcb1f-node-pullsecrets\") pod \"sg-bridge-2-build\" (UID: \"93f73a7a-92cb-47a8-bb25-813154fbcb1f\") " pod="service-telemetry/sg-bridge-2-build" Oct 03 00:31:47 crc kubenswrapper[4832]: I1003 00:31:47.740369 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/93f73a7a-92cb-47a8-bb25-813154fbcb1f-node-pullsecrets\") pod \"sg-bridge-2-build\" (UID: \"93f73a7a-92cb-47a8-bb25-813154fbcb1f\") " pod="service-telemetry/sg-bridge-2-build" Oct 03 00:31:47 crc kubenswrapper[4832]: I1003 00:31:47.744499 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-bl95s-pull\" (UniqueName: \"kubernetes.io/secret/93f73a7a-92cb-47a8-bb25-813154fbcb1f-builder-dockercfg-bl95s-pull\") pod \"sg-bridge-2-build\" (UID: \"93f73a7a-92cb-47a8-bb25-813154fbcb1f\") " pod="service-telemetry/sg-bridge-2-build" Oct 03 00:31:47 crc 
kubenswrapper[4832]: I1003 00:31:47.744614 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-bl95s-push\" (UniqueName: \"kubernetes.io/secret/93f73a7a-92cb-47a8-bb25-813154fbcb1f-builder-dockercfg-bl95s-push\") pod \"sg-bridge-2-build\" (UID: \"93f73a7a-92cb-47a8-bb25-813154fbcb1f\") " pod="service-telemetry/sg-bridge-2-build" Oct 03 00:31:47 crc kubenswrapper[4832]: I1003 00:31:47.762094 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wp7dz\" (UniqueName: \"kubernetes.io/projected/93f73a7a-92cb-47a8-bb25-813154fbcb1f-kube-api-access-wp7dz\") pod \"sg-bridge-2-build\" (UID: \"93f73a7a-92cb-47a8-bb25-813154fbcb1f\") " pod="service-telemetry/sg-bridge-2-build" Oct 03 00:31:47 crc kubenswrapper[4832]: I1003 00:31:47.853785 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-bridge-2-build" Oct 03 00:31:48 crc kubenswrapper[4832]: I1003 00:31:48.074168 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-bridge-2-build"] Oct 03 00:31:48 crc kubenswrapper[4832]: I1003 00:31:48.477197 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-2-build" event={"ID":"93f73a7a-92cb-47a8-bb25-813154fbcb1f","Type":"ContainerStarted","Data":"d21063c35e5b8889ae8538cb50f79d3e04fab64d6d5c56d8107144783c3217c9"} Oct 03 00:31:48 crc kubenswrapper[4832]: I1003 00:31:48.477238 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-2-build" event={"ID":"93f73a7a-92cb-47a8-bb25-813154fbcb1f","Type":"ContainerStarted","Data":"7d0ee70d2c25e2ac2cffad5706792319b11582e637ad0d5a3d7e59f9eb0cacab"} Oct 03 00:31:48 crc kubenswrapper[4832]: E1003 00:31:48.569377 4832 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.162:52806->38.102.83.162:41165: write tcp 38.102.83.162:52806->38.102.83.162:41165: write: broken pipe Oct 03 00:31:49 crc kubenswrapper[4832]: I1003 00:31:49.485062 4832 generic.go:334] "Generic (PLEG): container finished" podID="93f73a7a-92cb-47a8-bb25-813154fbcb1f" containerID="d21063c35e5b8889ae8538cb50f79d3e04fab64d6d5c56d8107144783c3217c9" exitCode=0 Oct 03 00:31:49 crc kubenswrapper[4832]: I1003 00:31:49.485139 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-2-build" event={"ID":"93f73a7a-92cb-47a8-bb25-813154fbcb1f","Type":"ContainerDied","Data":"d21063c35e5b8889ae8538cb50f79d3e04fab64d6d5c56d8107144783c3217c9"} Oct 03 00:31:50 crc kubenswrapper[4832]: I1003 00:31:50.504200 4832 generic.go:334] "Generic (PLEG): container finished" podID="93f73a7a-92cb-47a8-bb25-813154fbcb1f" containerID="5d51e616f7374862c0df742b675e278ebe9d4bc758f4e3d949380e6e54147aba" exitCode=0 Oct 03 00:31:50 crc kubenswrapper[4832]: I1003 00:31:50.504240 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-2-build" event={"ID":"93f73a7a-92cb-47a8-bb25-813154fbcb1f","Type":"ContainerDied","Data":"5d51e616f7374862c0df742b675e278ebe9d4bc758f4e3d949380e6e54147aba"} Oct 03 00:31:50 crc kubenswrapper[4832]: I1003 00:31:50.539556 4832 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_sg-bridge-2-build_93f73a7a-92cb-47a8-bb25-813154fbcb1f/manage-dockerfile/0.log" Oct 03 00:31:51 crc kubenswrapper[4832]: I1003 00:31:51.512277 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-2-build" 
event={"ID":"93f73a7a-92cb-47a8-bb25-813154fbcb1f","Type":"ContainerStarted","Data":"a468cee97bb943d2088c0ec20e694dd579f14acf35c9c605bb63fbd712f459e3"} Oct 03 00:31:51 crc kubenswrapper[4832]: I1003 00:31:51.551470 4832 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/sg-bridge-2-build" podStartSLOduration=4.551428406 podStartE2EDuration="4.551428406s" podCreationTimestamp="2025-10-03 00:31:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 00:31:51.545208464 +0000 UTC m=+1504.807363570" watchObservedRunningTime="2025-10-03 00:31:51.551428406 +0000 UTC m=+1504.813583522" Oct 03 00:31:55 crc kubenswrapper[4832]: I1003 00:31:55.386405 4832 scope.go:117] "RemoveContainer" containerID="e44b7de9ac0dc4f733d645c5d56e63447d6c890efc8c243af73d5c7937b609ee" Oct 03 00:31:55 crc kubenswrapper[4832]: E1003 00:31:55.386611 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xzpk4_openshift-machine-config-operator(13eeb165-19f7-4e3f-a442-1c9c570d853b)\"" pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" podUID="13eeb165-19f7-4e3f-a442-1c9c570d853b" Oct 03 00:32:06 crc kubenswrapper[4832]: I1003 00:32:06.313057 4832 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-lfrv7"] Oct 03 00:32:06 crc kubenswrapper[4832]: I1003 00:32:06.314819 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lfrv7" Oct 03 00:32:06 crc kubenswrapper[4832]: I1003 00:32:06.339554 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lfrv7"] Oct 03 00:32:06 crc kubenswrapper[4832]: I1003 00:32:06.389710 4832 scope.go:117] "RemoveContainer" containerID="e44b7de9ac0dc4f733d645c5d56e63447d6c890efc8c243af73d5c7937b609ee" Oct 03 00:32:06 crc kubenswrapper[4832]: E1003 00:32:06.390258 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xzpk4_openshift-machine-config-operator(13eeb165-19f7-4e3f-a442-1c9c570d853b)\"" pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" podUID="13eeb165-19f7-4e3f-a442-1c9c570d853b" Oct 03 00:32:06 crc kubenswrapper[4832]: I1003 00:32:06.503126 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/162803e4-f5f3-4b74-bc49-f78ee88caefa-catalog-content\") pod \"community-operators-lfrv7\" (UID: \"162803e4-f5f3-4b74-bc49-f78ee88caefa\") " pod="openshift-marketplace/community-operators-lfrv7" Oct 03 00:32:06 crc kubenswrapper[4832]: I1003 00:32:06.503243 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qtqfb\" (UniqueName: \"kubernetes.io/projected/162803e4-f5f3-4b74-bc49-f78ee88caefa-kube-api-access-qtqfb\") pod \"community-operators-lfrv7\" (UID: \"162803e4-f5f3-4b74-bc49-f78ee88caefa\") " pod="openshift-marketplace/community-operators-lfrv7" Oct 03 00:32:06 crc kubenswrapper[4832]: I1003 00:32:06.503394 4832 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/162803e4-f5f3-4b74-bc49-f78ee88caefa-utilities\") pod \"community-operators-lfrv7\" (UID: \"162803e4-f5f3-4b74-bc49-f78ee88caefa\") " pod="openshift-marketplace/community-operators-lfrv7" Oct 03 00:32:06 crc kubenswrapper[4832]: I1003 00:32:06.604372 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/162803e4-f5f3-4b74-bc49-f78ee88caefa-catalog-content\") pod \"community-operators-lfrv7\" (UID: \"162803e4-f5f3-4b74-bc49-f78ee88caefa\") " pod="openshift-marketplace/community-operators-lfrv7" Oct 03 00:32:06 crc kubenswrapper[4832]: I1003 00:32:06.604432 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qtqfb\" (UniqueName: \"kubernetes.io/projected/162803e4-f5f3-4b74-bc49-f78ee88caefa-kube-api-access-qtqfb\") pod \"community-operators-lfrv7\" (UID: \"162803e4-f5f3-4b74-bc49-f78ee88caefa\") " pod="openshift-marketplace/community-operators-lfrv7" Oct 03 00:32:06 crc kubenswrapper[4832]: I1003 00:32:06.604480 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/162803e4-f5f3-4b74-bc49-f78ee88caefa-utilities\") pod \"community-operators-lfrv7\" (UID: \"162803e4-f5f3-4b74-bc49-f78ee88caefa\") " pod="openshift-marketplace/community-operators-lfrv7" Oct 03 00:32:06 crc kubenswrapper[4832]: I1003 00:32:06.605010 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/162803e4-f5f3-4b74-bc49-f78ee88caefa-catalog-content\") pod \"community-operators-lfrv7\" (UID: \"162803e4-f5f3-4b74-bc49-f78ee88caefa\") " pod="openshift-marketplace/community-operators-lfrv7" Oct 03 00:32:06 crc kubenswrapper[4832]: I1003 00:32:06.605043 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/162803e4-f5f3-4b74-bc49-f78ee88caefa-utilities\") pod \"community-operators-lfrv7\" (UID: \"162803e4-f5f3-4b74-bc49-f78ee88caefa\") " pod="openshift-marketplace/community-operators-lfrv7" Oct 03 00:32:06 crc kubenswrapper[4832]: I1003 00:32:06.621876 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qtqfb\" (UniqueName: \"kubernetes.io/projected/162803e4-f5f3-4b74-bc49-f78ee88caefa-kube-api-access-qtqfb\") pod \"community-operators-lfrv7\" (UID: \"162803e4-f5f3-4b74-bc49-f78ee88caefa\") " pod="openshift-marketplace/community-operators-lfrv7" Oct 03 00:32:06 crc kubenswrapper[4832]: I1003 00:32:06.632931 4832 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lfrv7" Oct 03 00:32:06 crc kubenswrapper[4832]: I1003 00:32:06.901698 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lfrv7"] Oct 03 00:32:07 crc kubenswrapper[4832]: I1003 00:32:07.604444 4832 generic.go:334] "Generic (PLEG): container finished" podID="162803e4-f5f3-4b74-bc49-f78ee88caefa" containerID="e5fb401b92638ba33ddf0e655da7fd6891f2428a6a19f1718f940ae4a50426dd" exitCode=0 Oct 03 00:32:07 crc kubenswrapper[4832]: I1003 00:32:07.604659 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lfrv7" event={"ID":"162803e4-f5f3-4b74-bc49-f78ee88caefa","Type":"ContainerDied","Data":"e5fb401b92638ba33ddf0e655da7fd6891f2428a6a19f1718f940ae4a50426dd"} Oct 03 00:32:07 crc kubenswrapper[4832]: I1003 00:32:07.604780 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lfrv7" event={"ID":"162803e4-f5f3-4b74-bc49-f78ee88caefa","Type":"ContainerStarted","Data":"e555f1c9de93c526c8be8838dea18dfc6634465549b42c01153beacfed8694ad"} Oct 03 00:32:11 crc kubenswrapper[4832]: I1003 00:32:11.635478 4832 generic.go:334] "Generic (PLEG): container finished" podID="162803e4-f5f3-4b74-bc49-f78ee88caefa" containerID="d308011e3e64133353864fa52b2b79e20be9bf0f2369554dcf55e69e4cf3de13" exitCode=0 Oct 03 00:32:11 crc kubenswrapper[4832]: I1003 00:32:11.635645 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lfrv7" event={"ID":"162803e4-f5f3-4b74-bc49-f78ee88caefa","Type":"ContainerDied","Data":"d308011e3e64133353864fa52b2b79e20be9bf0f2369554dcf55e69e4cf3de13"} Oct 03 00:32:12 crc kubenswrapper[4832]: I1003 00:32:12.642775 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lfrv7" event={"ID":"162803e4-f5f3-4b74-bc49-f78ee88caefa","Type":"ContainerStarted","Data":"26a15e8bb20d5f40866d811513d064d4a2902ba041290061a35d7013b87db0f4"} Oct 03 00:32:12 crc kubenswrapper[4832]: I1003 00:32:12.664010 4832 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-lfrv7" podStartSLOduration=2.225787816 podStartE2EDuration="6.663968207s" podCreationTimestamp="2025-10-03 00:32:06 +0000 UTC" firstStartedPulling="2025-10-03 00:32:07.606520772 +0000 UTC m=+1520.868675868" lastFinishedPulling="2025-10-03 00:32:12.044701153 +0000 UTC m=+1525.306856259" observedRunningTime="2025-10-03 00:32:12.658863402 +0000 UTC m=+1525.921018508" watchObservedRunningTime="2025-10-03 00:32:12.663968207 +0000 UTC m=+1525.926123313" Oct 03 00:32:16 crc kubenswrapper[4832]: I1003 00:32:16.634096 4832 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-lfrv7" Oct 03 00:32:16 crc kubenswrapper[4832]: I1003 00:32:16.634657 4832 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-lfrv7" Oct 03 00:32:16 crc kubenswrapper[4832]: I1003 00:32:16.670539 4832 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-lfrv7" Oct 03 00:32:19 crc kubenswrapper[4832]: I1003 00:32:19.387126 4832 scope.go:117] "RemoveContainer" containerID="e44b7de9ac0dc4f733d645c5d56e63447d6c890efc8c243af73d5c7937b609ee" Oct 03 00:32:19 crc kubenswrapper[4832]: E1003 00:32:19.387679 4832 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xzpk4_openshift-machine-config-operator(13eeb165-19f7-4e3f-a442-1c9c570d853b)\"" pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" podUID="13eeb165-19f7-4e3f-a442-1c9c570d853b" Oct 03 00:32:26 crc kubenswrapper[4832]: I1003 00:32:26.674314 4832 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-lfrv7" Oct 03 00:32:26 crc kubenswrapper[4832]: I1003 00:32:26.749155 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lfrv7"] Oct 03 00:32:26 crc kubenswrapper[4832]: I1003 00:32:26.797823 4832 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-f7556"] Oct 03 00:32:26 crc kubenswrapper[4832]: I1003 00:32:26.798084 4832 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-f7556" podUID="18676bf1-5bc1-452b-89a1-bdd989c9e0b1" containerName="registry-server" containerID="cri-o://2c9519d841f56292c58e33fe406cecd0ba1481faa5c92289f00a690d2b9f86a0" gracePeriod=2 Oct 03 00:32:26 crc kubenswrapper[4832]: E1003 00:32:26.947129 4832 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2c9519d841f56292c58e33fe406cecd0ba1481faa5c92289f00a690d2b9f86a0 is running failed: container process not found" containerID="2c9519d841f56292c58e33fe406cecd0ba1481faa5c92289f00a690d2b9f86a0" cmd=["grpc_health_probe","-addr=:50051"] Oct 03 00:32:26 crc kubenswrapper[4832]: E1003 00:32:26.947597 4832 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2c9519d841f56292c58e33fe406cecd0ba1481faa5c92289f00a690d2b9f86a0 is running failed: container process not found" containerID="2c9519d841f56292c58e33fe406cecd0ba1481faa5c92289f00a690d2b9f86a0" cmd=["grpc_health_probe","-addr=:50051"] Oct 03 00:32:26 crc kubenswrapper[4832]: E1003 00:32:26.948034 4832 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2c9519d841f56292c58e33fe406cecd0ba1481faa5c92289f00a690d2b9f86a0 is running failed: container process not found" containerID="2c9519d841f56292c58e33fe406cecd0ba1481faa5c92289f00a690d2b9f86a0" cmd=["grpc_health_probe","-addr=:50051"] Oct 03 00:32:26 crc kubenswrapper[4832]: E1003 00:32:26.948114 4832 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2c9519d841f56292c58e33fe406cecd0ba1481faa5c92289f00a690d2b9f86a0 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/community-operators-f7556" podUID="18676bf1-5bc1-452b-89a1-bdd989c9e0b1" containerName="registry-server" Oct 03 00:32:27 crc kubenswrapper[4832]: I1003 00:32:27.144803 4832 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-f7556" Oct 03 00:32:27 crc kubenswrapper[4832]: I1003 00:32:27.319289 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18676bf1-5bc1-452b-89a1-bdd989c9e0b1-utilities\") pod \"18676bf1-5bc1-452b-89a1-bdd989c9e0b1\" (UID: \"18676bf1-5bc1-452b-89a1-bdd989c9e0b1\") " Oct 03 00:32:27 crc kubenswrapper[4832]: I1003 00:32:27.319338 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g7kw2\" (UniqueName: \"kubernetes.io/projected/18676bf1-5bc1-452b-89a1-bdd989c9e0b1-kube-api-access-g7kw2\") pod \"18676bf1-5bc1-452b-89a1-bdd989c9e0b1\" (UID: \"18676bf1-5bc1-452b-89a1-bdd989c9e0b1\") " Oct 03 00:32:27 crc kubenswrapper[4832]: I1003 00:32:27.319398 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18676bf1-5bc1-452b-89a1-bdd989c9e0b1-catalog-content\") pod \"18676bf1-5bc1-452b-89a1-bdd989c9e0b1\" (UID: \"18676bf1-5bc1-452b-89a1-bdd989c9e0b1\") " Oct 03 00:32:27 crc kubenswrapper[4832]: I1003 00:32:27.321600 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/18676bf1-5bc1-452b-89a1-bdd989c9e0b1-utilities" (OuterVolumeSpecName: "utilities") pod "18676bf1-5bc1-452b-89a1-bdd989c9e0b1" (UID: "18676bf1-5bc1-452b-89a1-bdd989c9e0b1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 00:32:27 crc kubenswrapper[4832]: I1003 00:32:27.328879 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18676bf1-5bc1-452b-89a1-bdd989c9e0b1-kube-api-access-g7kw2" (OuterVolumeSpecName: "kube-api-access-g7kw2") pod "18676bf1-5bc1-452b-89a1-bdd989c9e0b1" (UID: "18676bf1-5bc1-452b-89a1-bdd989c9e0b1"). InnerVolumeSpecName "kube-api-access-g7kw2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 00:32:27 crc kubenswrapper[4832]: I1003 00:32:27.365731 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/18676bf1-5bc1-452b-89a1-bdd989c9e0b1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "18676bf1-5bc1-452b-89a1-bdd989c9e0b1" (UID: "18676bf1-5bc1-452b-89a1-bdd989c9e0b1"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 00:32:27 crc kubenswrapper[4832]: I1003 00:32:27.421493 4832 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18676bf1-5bc1-452b-89a1-bdd989c9e0b1-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 00:32:27 crc kubenswrapper[4832]: I1003 00:32:27.421535 4832 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g7kw2\" (UniqueName: \"kubernetes.io/projected/18676bf1-5bc1-452b-89a1-bdd989c9e0b1-kube-api-access-g7kw2\") on node \"crc\" DevicePath \"\"" Oct 03 00:32:27 crc kubenswrapper[4832]: I1003 00:32:27.421547 4832 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18676bf1-5bc1-452b-89a1-bdd989c9e0b1-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 00:32:27 crc kubenswrapper[4832]: I1003 00:32:27.743407 4832 generic.go:334] "Generic (PLEG): container finished" podID="18676bf1-5bc1-452b-89a1-bdd989c9e0b1" containerID="2c9519d841f56292c58e33fe406cecd0ba1481faa5c92289f00a690d2b9f86a0" exitCode=0 Oct 03 00:32:27 crc kubenswrapper[4832]: I1003 00:32:27.743446 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f7556" event={"ID":"18676bf1-5bc1-452b-89a1-bdd989c9e0b1","Type":"ContainerDied","Data":"2c9519d841f56292c58e33fe406cecd0ba1481faa5c92289f00a690d2b9f86a0"} Oct 03 00:32:27 crc kubenswrapper[4832]: I1003 00:32:27.743471 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f7556" event={"ID":"18676bf1-5bc1-452b-89a1-bdd989c9e0b1","Type":"ContainerDied","Data":"f4fa208eee473803b2cf95fc0dda1a857b7d71ca176e8633a687ba18e004d6b8"} Oct 03 00:32:27 crc kubenswrapper[4832]: I1003 00:32:27.743487 4832 scope.go:117] "RemoveContainer" containerID="2c9519d841f56292c58e33fe406cecd0ba1481faa5c92289f00a690d2b9f86a0" Oct 03 00:32:27 crc kubenswrapper[4832]: I1003 00:32:27.743482 4832 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-f7556" Oct 03 00:32:27 crc kubenswrapper[4832]: I1003 00:32:27.760934 4832 scope.go:117] "RemoveContainer" containerID="bfc0489fa52d551b769dd75de7f1244794546c28bc0cb04e1bb2f914217cc0a8" Oct 03 00:32:27 crc kubenswrapper[4832]: I1003 00:32:27.762097 4832 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-f7556"] Oct 03 00:32:27 crc kubenswrapper[4832]: I1003 00:32:27.766949 4832 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-f7556"] Oct 03 00:32:27 crc kubenswrapper[4832]: I1003 00:32:27.781592 4832 scope.go:117] "RemoveContainer" containerID="2e0cb541615f65ea539f88c1dfbe1f7b5e4e8fa592b250e6b9dc46307c9f3ad2" Oct 03 00:32:27 crc kubenswrapper[4832]: I1003 00:32:27.800543 4832 scope.go:117] "RemoveContainer" containerID="2c9519d841f56292c58e33fe406cecd0ba1481faa5c92289f00a690d2b9f86a0" Oct 03 00:32:27 crc kubenswrapper[4832]: E1003 00:32:27.801003 4832 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c9519d841f56292c58e33fe406cecd0ba1481faa5c92289f00a690d2b9f86a0\": container with ID starting with 2c9519d841f56292c58e33fe406cecd0ba1481faa5c92289f00a690d2b9f86a0 not found: ID does not exist" containerID="2c9519d841f56292c58e33fe406cecd0ba1481faa5c92289f00a690d2b9f86a0" Oct 03 00:32:27 crc kubenswrapper[4832]: I1003 00:32:27.801033 4832 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c9519d841f56292c58e33fe406cecd0ba1481faa5c92289f00a690d2b9f86a0"} err="failed to get container status \"2c9519d841f56292c58e33fe406cecd0ba1481faa5c92289f00a690d2b9f86a0\": rpc error: code = NotFound desc = could not find container \"2c9519d841f56292c58e33fe406cecd0ba1481faa5c92289f00a690d2b9f86a0\": container with ID starting with 2c9519d841f56292c58e33fe406cecd0ba1481faa5c92289f00a690d2b9f86a0 not found: ID does not exist" Oct 03 00:32:27 crc kubenswrapper[4832]: I1003 00:32:27.801055 4832 scope.go:117] "RemoveContainer" containerID="bfc0489fa52d551b769dd75de7f1244794546c28bc0cb04e1bb2f914217cc0a8" Oct 03 00:32:27 crc kubenswrapper[4832]: E1003 00:32:27.801365 4832 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bfc0489fa52d551b769dd75de7f1244794546c28bc0cb04e1bb2f914217cc0a8\": container with ID starting with bfc0489fa52d551b769dd75de7f1244794546c28bc0cb04e1bb2f914217cc0a8 not found: ID does not exist" containerID="bfc0489fa52d551b769dd75de7f1244794546c28bc0cb04e1bb2f914217cc0a8" Oct 03 00:32:27 crc kubenswrapper[4832]: I1003 00:32:27.801397 4832 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bfc0489fa52d551b769dd75de7f1244794546c28bc0cb04e1bb2f914217cc0a8"} err="failed to get container status \"bfc0489fa52d551b769dd75de7f1244794546c28bc0cb04e1bb2f914217cc0a8\": rpc error: code = NotFound desc = could not find container \"bfc0489fa52d551b769dd75de7f1244794546c28bc0cb04e1bb2f914217cc0a8\": container with ID starting with bfc0489fa52d551b769dd75de7f1244794546c28bc0cb04e1bb2f914217cc0a8 not found: ID does not exist" Oct 03 00:32:27 crc kubenswrapper[4832]: I1003 00:32:27.801419 4832 scope.go:117] "RemoveContainer" containerID="2e0cb541615f65ea539f88c1dfbe1f7b5e4e8fa592b250e6b9dc46307c9f3ad2" Oct 03 00:32:27 crc kubenswrapper[4832]: E1003 00:32:27.801710 4832 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"2e0cb541615f65ea539f88c1dfbe1f7b5e4e8fa592b250e6b9dc46307c9f3ad2\": container with ID starting with 2e0cb541615f65ea539f88c1dfbe1f7b5e4e8fa592b250e6b9dc46307c9f3ad2 not found: ID does not exist" containerID="2e0cb541615f65ea539f88c1dfbe1f7b5e4e8fa592b250e6b9dc46307c9f3ad2" Oct 03 00:32:27 crc kubenswrapper[4832]: I1003 00:32:27.801734 4832 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e0cb541615f65ea539f88c1dfbe1f7b5e4e8fa592b250e6b9dc46307c9f3ad2"} err="failed to get container status \"2e0cb541615f65ea539f88c1dfbe1f7b5e4e8fa592b250e6b9dc46307c9f3ad2\": rpc error: code = NotFound desc = could not find container \"2e0cb541615f65ea539f88c1dfbe1f7b5e4e8fa592b250e6b9dc46307c9f3ad2\": container with ID starting with 2e0cb541615f65ea539f88c1dfbe1f7b5e4e8fa592b250e6b9dc46307c9f3ad2 not found: ID does not exist" Oct 03 00:32:29 crc kubenswrapper[4832]: I1003 00:32:29.394257 4832 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18676bf1-5bc1-452b-89a1-bdd989c9e0b1" path="/var/lib/kubelet/pods/18676bf1-5bc1-452b-89a1-bdd989c9e0b1/volumes" Oct 03 00:32:32 crc kubenswrapper[4832]: I1003 00:32:32.385891 4832 scope.go:117] "RemoveContainer" containerID="e44b7de9ac0dc4f733d645c5d56e63447d6c890efc8c243af73d5c7937b609ee" Oct 03 00:32:32 crc kubenswrapper[4832]: E1003 00:32:32.386099 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xzpk4_openshift-machine-config-operator(13eeb165-19f7-4e3f-a442-1c9c570d853b)\"" pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" podUID="13eeb165-19f7-4e3f-a442-1c9c570d853b" Oct 03 00:32:38 crc kubenswrapper[4832]: I1003 00:32:38.811972 4832 generic.go:334] "Generic (PLEG): container finished" podID="93f73a7a-92cb-47a8-bb25-813154fbcb1f" containerID="a468cee97bb943d2088c0ec20e694dd579f14acf35c9c605bb63fbd712f459e3" exitCode=0 Oct 03 00:32:38 crc kubenswrapper[4832]: I1003 00:32:38.812038 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-2-build" event={"ID":"93f73a7a-92cb-47a8-bb25-813154fbcb1f","Type":"ContainerDied","Data":"a468cee97bb943d2088c0ec20e694dd579f14acf35c9c605bb63fbd712f459e3"} Oct 03 00:32:40 crc kubenswrapper[4832]: I1003 00:32:40.035007 4832 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-bridge-2-build" Oct 03 00:32:40 crc kubenswrapper[4832]: I1003 00:32:40.177252 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-bl95s-pull\" (UniqueName: \"kubernetes.io/secret/93f73a7a-92cb-47a8-bb25-813154fbcb1f-builder-dockercfg-bl95s-pull\") pod \"93f73a7a-92cb-47a8-bb25-813154fbcb1f\" (UID: \"93f73a7a-92cb-47a8-bb25-813154fbcb1f\") " Oct 03 00:32:40 crc kubenswrapper[4832]: I1003 00:32:40.177314 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/93f73a7a-92cb-47a8-bb25-813154fbcb1f-container-storage-root\") pod \"93f73a7a-92cb-47a8-bb25-813154fbcb1f\" (UID: \"93f73a7a-92cb-47a8-bb25-813154fbcb1f\") " Oct 03 00:32:40 crc kubenswrapper[4832]: I1003 00:32:40.177343 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/93f73a7a-92cb-47a8-bb25-813154fbcb1f-buildworkdir\") pod \"93f73a7a-92cb-47a8-bb25-813154fbcb1f\" (UID: \"93f73a7a-92cb-47a8-bb25-813154fbcb1f\") " Oct 03 00:32:40 crc kubenswrapper[4832]: I1003 00:32:40.177482 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wp7dz\" (UniqueName: \"kubernetes.io/projected/93f73a7a-92cb-47a8-bb25-813154fbcb1f-kube-api-access-wp7dz\") pod \"93f73a7a-92cb-47a8-bb25-813154fbcb1f\" (UID: \"93f73a7a-92cb-47a8-bb25-813154fbcb1f\") " Oct 03 00:32:40 crc kubenswrapper[4832]: I1003 00:32:40.177518 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/93f73a7a-92cb-47a8-bb25-813154fbcb1f-build-blob-cache\") pod \"93f73a7a-92cb-47a8-bb25-813154fbcb1f\" (UID: \"93f73a7a-92cb-47a8-bb25-813154fbcb1f\") " Oct 03 00:32:40 crc kubenswrapper[4832]: I1003 00:32:40.178486 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93f73a7a-92cb-47a8-bb25-813154fbcb1f-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "93f73a7a-92cb-47a8-bb25-813154fbcb1f" (UID: "93f73a7a-92cb-47a8-bb25-813154fbcb1f"). InnerVolumeSpecName "build-proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 00:32:40 crc kubenswrapper[4832]: I1003 00:32:40.178838 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/93f73a7a-92cb-47a8-bb25-813154fbcb1f-build-proxy-ca-bundles\") pod \"93f73a7a-92cb-47a8-bb25-813154fbcb1f\" (UID: \"93f73a7a-92cb-47a8-bb25-813154fbcb1f\") " Oct 03 00:32:40 crc kubenswrapper[4832]: I1003 00:32:40.178886 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/93f73a7a-92cb-47a8-bb25-813154fbcb1f-buildcachedir\") pod \"93f73a7a-92cb-47a8-bb25-813154fbcb1f\" (UID: \"93f73a7a-92cb-47a8-bb25-813154fbcb1f\") " Oct 03 00:32:40 crc kubenswrapper[4832]: I1003 00:32:40.178913 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-bl95s-push\" (UniqueName: \"kubernetes.io/secret/93f73a7a-92cb-47a8-bb25-813154fbcb1f-builder-dockercfg-bl95s-push\") pod \"93f73a7a-92cb-47a8-bb25-813154fbcb1f\" (UID: \"93f73a7a-92cb-47a8-bb25-813154fbcb1f\") " Oct 03 00:32:40 crc kubenswrapper[4832]: I1003 00:32:40.178982 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/93f73a7a-92cb-47a8-bb25-813154fbcb1f-container-storage-run\") pod \"93f73a7a-92cb-47a8-bb25-813154fbcb1f\" (UID: \"93f73a7a-92cb-47a8-bb25-813154fbcb1f\") " Oct 03 00:32:40 crc kubenswrapper[4832]: I1003 00:32:40.178994 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/93f73a7a-92cb-47a8-bb25-813154fbcb1f-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "93f73a7a-92cb-47a8-bb25-813154fbcb1f" (UID: "93f73a7a-92cb-47a8-bb25-813154fbcb1f"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 00:32:40 crc kubenswrapper[4832]: I1003 00:32:40.179002 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/93f73a7a-92cb-47a8-bb25-813154fbcb1f-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "93f73a7a-92cb-47a8-bb25-813154fbcb1f" (UID: "93f73a7a-92cb-47a8-bb25-813154fbcb1f"). InnerVolumeSpecName "buildworkdir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 00:32:40 crc kubenswrapper[4832]: I1003 00:32:40.179054 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/93f73a7a-92cb-47a8-bb25-813154fbcb1f-node-pullsecrets\") pod \"93f73a7a-92cb-47a8-bb25-813154fbcb1f\" (UID: \"93f73a7a-92cb-47a8-bb25-813154fbcb1f\") " Oct 03 00:32:40 crc kubenswrapper[4832]: I1003 00:32:40.179080 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/93f73a7a-92cb-47a8-bb25-813154fbcb1f-build-system-configs\") pod \"93f73a7a-92cb-47a8-bb25-813154fbcb1f\" (UID: \"93f73a7a-92cb-47a8-bb25-813154fbcb1f\") " Oct 03 00:32:40 crc kubenswrapper[4832]: I1003 00:32:40.179102 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/93f73a7a-92cb-47a8-bb25-813154fbcb1f-build-ca-bundles\") pod \"93f73a7a-92cb-47a8-bb25-813154fbcb1f\" (UID: \"93f73a7a-92cb-47a8-bb25-813154fbcb1f\") " Oct 03 00:32:40 crc kubenswrapper[4832]: I1003 00:32:40.179369 4832 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/93f73a7a-92cb-47a8-bb25-813154fbcb1f-buildworkdir\") on node \"crc\" DevicePath \"\"" Oct 03 00:32:40 crc kubenswrapper[4832]: I1003 00:32:40.179391 4832 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/93f73a7a-92cb-47a8-bb25-813154fbcb1f-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 03 00:32:40 crc kubenswrapper[4832]: I1003 00:32:40.179403 4832 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/93f73a7a-92cb-47a8-bb25-813154fbcb1f-buildcachedir\") on node \"crc\" DevicePath \"\"" Oct 03 00:32:40 crc kubenswrapper[4832]: I1003 00:32:40.179862 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93f73a7a-92cb-47a8-bb25-813154fbcb1f-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "93f73a7a-92cb-47a8-bb25-813154fbcb1f" (UID: "93f73a7a-92cb-47a8-bb25-813154fbcb1f"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 00:32:40 crc kubenswrapper[4832]: I1003 00:32:40.179878 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/93f73a7a-92cb-47a8-bb25-813154fbcb1f-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "93f73a7a-92cb-47a8-bb25-813154fbcb1f" (UID: "93f73a7a-92cb-47a8-bb25-813154fbcb1f"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 00:32:40 crc kubenswrapper[4832]: I1003 00:32:40.180306 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93f73a7a-92cb-47a8-bb25-813154fbcb1f-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "93f73a7a-92cb-47a8-bb25-813154fbcb1f" (UID: "93f73a7a-92cb-47a8-bb25-813154fbcb1f"). InnerVolumeSpecName "build-system-configs". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 00:32:40 crc kubenswrapper[4832]: I1003 00:32:40.180843 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/93f73a7a-92cb-47a8-bb25-813154fbcb1f-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "93f73a7a-92cb-47a8-bb25-813154fbcb1f" (UID: "93f73a7a-92cb-47a8-bb25-813154fbcb1f"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 00:32:40 crc kubenswrapper[4832]: I1003 00:32:40.182802 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93f73a7a-92cb-47a8-bb25-813154fbcb1f-builder-dockercfg-bl95s-push" (OuterVolumeSpecName: "builder-dockercfg-bl95s-push") pod "93f73a7a-92cb-47a8-bb25-813154fbcb1f" (UID: "93f73a7a-92cb-47a8-bb25-813154fbcb1f"). InnerVolumeSpecName "builder-dockercfg-bl95s-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 00:32:40 crc kubenswrapper[4832]: I1003 00:32:40.182859 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93f73a7a-92cb-47a8-bb25-813154fbcb1f-kube-api-access-wp7dz" (OuterVolumeSpecName: "kube-api-access-wp7dz") pod "93f73a7a-92cb-47a8-bb25-813154fbcb1f" (UID: "93f73a7a-92cb-47a8-bb25-813154fbcb1f"). InnerVolumeSpecName "kube-api-access-wp7dz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 00:32:40 crc kubenswrapper[4832]: I1003 00:32:40.182902 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93f73a7a-92cb-47a8-bb25-813154fbcb1f-builder-dockercfg-bl95s-pull" (OuterVolumeSpecName: "builder-dockercfg-bl95s-pull") pod "93f73a7a-92cb-47a8-bb25-813154fbcb1f" (UID: "93f73a7a-92cb-47a8-bb25-813154fbcb1f"). InnerVolumeSpecName "builder-dockercfg-bl95s-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 00:32:40 crc kubenswrapper[4832]: I1003 00:32:40.280114 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/93f73a7a-92cb-47a8-bb25-813154fbcb1f-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "93f73a7a-92cb-47a8-bb25-813154fbcb1f" (UID: "93f73a7a-92cb-47a8-bb25-813154fbcb1f"). InnerVolumeSpecName "build-blob-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 00:32:40 crc kubenswrapper[4832]: I1003 00:32:40.280933 4832 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-bl95s-push\" (UniqueName: \"kubernetes.io/secret/93f73a7a-92cb-47a8-bb25-813154fbcb1f-builder-dockercfg-bl95s-push\") on node \"crc\" DevicePath \"\"" Oct 03 00:32:40 crc kubenswrapper[4832]: I1003 00:32:40.280960 4832 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/93f73a7a-92cb-47a8-bb25-813154fbcb1f-container-storage-run\") on node \"crc\" DevicePath \"\"" Oct 03 00:32:40 crc kubenswrapper[4832]: I1003 00:32:40.280974 4832 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/93f73a7a-92cb-47a8-bb25-813154fbcb1f-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Oct 03 00:32:40 crc kubenswrapper[4832]: I1003 00:32:40.280988 4832 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/93f73a7a-92cb-47a8-bb25-813154fbcb1f-build-system-configs\") on node \"crc\" DevicePath \"\"" Oct 03 00:32:40 crc kubenswrapper[4832]: I1003 00:32:40.280999 4832 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/93f73a7a-92cb-47a8-bb25-813154fbcb1f-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 03 00:32:40 crc kubenswrapper[4832]: I1003 00:32:40.281010 4832 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-bl95s-pull\" (UniqueName: \"kubernetes.io/secret/93f73a7a-92cb-47a8-bb25-813154fbcb1f-builder-dockercfg-bl95s-pull\") on node \"crc\" DevicePath \"\"" Oct 03 00:32:40 crc kubenswrapper[4832]: I1003 00:32:40.281021 4832 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wp7dz\" (UniqueName: \"kubernetes.io/projected/93f73a7a-92cb-47a8-bb25-813154fbcb1f-kube-api-access-wp7dz\") on node \"crc\" DevicePath \"\"" Oct 03 00:32:40 crc kubenswrapper[4832]: I1003 00:32:40.281032 4832 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/93f73a7a-92cb-47a8-bb25-813154fbcb1f-build-blob-cache\") on node \"crc\" DevicePath \"\"" Oct 03 00:32:40 crc kubenswrapper[4832]: I1003 00:32:40.824223 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-2-build" event={"ID":"93f73a7a-92cb-47a8-bb25-813154fbcb1f","Type":"ContainerDied","Data":"7d0ee70d2c25e2ac2cffad5706792319b11582e637ad0d5a3d7e59f9eb0cacab"} Oct 03 00:32:40 crc kubenswrapper[4832]: I1003 00:32:40.824266 4832 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7d0ee70d2c25e2ac2cffad5706792319b11582e637ad0d5a3d7e59f9eb0cacab" Oct 03 00:32:40 crc kubenswrapper[4832]: I1003 00:32:40.824276 4832 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-bridge-2-build" Oct 03 00:32:40 crc kubenswrapper[4832]: I1003 00:32:40.858912 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/93f73a7a-92cb-47a8-bb25-813154fbcb1f-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "93f73a7a-92cb-47a8-bb25-813154fbcb1f" (UID: "93f73a7a-92cb-47a8-bb25-813154fbcb1f"). InnerVolumeSpecName "container-storage-root". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 00:32:40 crc kubenswrapper[4832]: I1003 00:32:40.887622 4832 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/93f73a7a-92cb-47a8-bb25-813154fbcb1f-container-storage-root\") on node \"crc\" DevicePath \"\"" Oct 03 00:32:43 crc kubenswrapper[4832]: I1003 00:32:43.386176 4832 scope.go:117] "RemoveContainer" containerID="e44b7de9ac0dc4f733d645c5d56e63447d6c890efc8c243af73d5c7937b609ee" Oct 03 00:32:43 crc kubenswrapper[4832]: E1003 00:32:43.386525 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xzpk4_openshift-machine-config-operator(13eeb165-19f7-4e3f-a442-1c9c570d853b)\"" pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" podUID="13eeb165-19f7-4e3f-a442-1c9c570d853b" Oct 03 00:32:44 crc kubenswrapper[4832]: I1003 00:32:44.608965 4832 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/prometheus-webhook-snmp-1-build"] Oct 03 00:32:44 crc kubenswrapper[4832]: E1003 00:32:44.609561 4832 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18676bf1-5bc1-452b-89a1-bdd989c9e0b1" containerName="extract-utilities" Oct 03 00:32:44 crc kubenswrapper[4832]: I1003 00:32:44.609575 4832 state_mem.go:107] "Deleted CPUSet assignment" podUID="18676bf1-5bc1-452b-89a1-bdd989c9e0b1" containerName="extract-utilities" Oct 03 00:32:44 crc kubenswrapper[4832]: E1003 00:32:44.609585 4832 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18676bf1-5bc1-452b-89a1-bdd989c9e0b1" containerName="registry-server" Oct 03 00:32:44 crc kubenswrapper[4832]: I1003 00:32:44.609591 4832 state_mem.go:107] "Deleted CPUSet assignment" podUID="18676bf1-5bc1-452b-89a1-bdd989c9e0b1" containerName="registry-server" Oct 03 00:32:44 crc kubenswrapper[4832]: E1003 00:32:44.609604 4832 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93f73a7a-92cb-47a8-bb25-813154fbcb1f" containerName="git-clone" Oct 03 00:32:44 crc kubenswrapper[4832]: I1003 00:32:44.609611 4832 state_mem.go:107] "Deleted CPUSet assignment" podUID="93f73a7a-92cb-47a8-bb25-813154fbcb1f" containerName="git-clone" Oct 03 00:32:44 crc kubenswrapper[4832]: E1003 00:32:44.609625 4832 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93f73a7a-92cb-47a8-bb25-813154fbcb1f" containerName="manage-dockerfile" Oct 03 00:32:44 crc kubenswrapper[4832]: I1003 00:32:44.609631 4832 state_mem.go:107] "Deleted CPUSet assignment" podUID="93f73a7a-92cb-47a8-bb25-813154fbcb1f" containerName="manage-dockerfile" Oct 03 00:32:44 crc kubenswrapper[4832]: E1003 00:32:44.609644 4832 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93f73a7a-92cb-47a8-bb25-813154fbcb1f" containerName="docker-build" Oct 03 00:32:44 crc kubenswrapper[4832]: I1003 00:32:44.609651 4832 state_mem.go:107] "Deleted CPUSet assignment" podUID="93f73a7a-92cb-47a8-bb25-813154fbcb1f" containerName="docker-build" Oct 03 00:32:44 crc kubenswrapper[4832]: E1003 00:32:44.609676 4832 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18676bf1-5bc1-452b-89a1-bdd989c9e0b1" containerName="extract-content" Oct 03 00:32:44 crc kubenswrapper[4832]: I1003 00:32:44.609682 4832 state_mem.go:107] "Deleted CPUSet assignment" podUID="18676bf1-5bc1-452b-89a1-bdd989c9e0b1" 
containerName="extract-content" Oct 03 00:32:44 crc kubenswrapper[4832]: I1003 00:32:44.609859 4832 memory_manager.go:354] "RemoveStaleState removing state" podUID="18676bf1-5bc1-452b-89a1-bdd989c9e0b1" containerName="registry-server" Oct 03 00:32:44 crc kubenswrapper[4832]: I1003 00:32:44.609886 4832 memory_manager.go:354] "RemoveStaleState removing state" podUID="93f73a7a-92cb-47a8-bb25-813154fbcb1f" containerName="docker-build" Oct 03 00:32:44 crc kubenswrapper[4832]: I1003 00:32:44.610747 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-1-build" Oct 03 00:32:44 crc kubenswrapper[4832]: I1003 00:32:44.617846 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/prometheus-webhook-snmp-1-build"] Oct 03 00:32:44 crc kubenswrapper[4832]: I1003 00:32:44.618918 4832 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-bl95s" Oct 03 00:32:44 crc kubenswrapper[4832]: I1003 00:32:44.618945 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-webhook-snmp-1-ca" Oct 03 00:32:44 crc kubenswrapper[4832]: I1003 00:32:44.619528 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-webhook-snmp-1-sys-config" Oct 03 00:32:44 crc kubenswrapper[4832]: I1003 00:32:44.619656 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-webhook-snmp-1-global-ca" Oct 03 00:32:44 crc kubenswrapper[4832]: I1003 00:32:44.630797 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-bl95s-pull\" (UniqueName: \"kubernetes.io/secret/e1cb05b3-f936-45f0-9b0c-29a83978e9f1-builder-dockercfg-bl95s-pull\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"e1cb05b3-f936-45f0-9b0c-29a83978e9f1\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Oct 03 00:32:44 crc kubenswrapper[4832]: I1003 00:32:44.630837 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/e1cb05b3-f936-45f0-9b0c-29a83978e9f1-container-storage-run\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"e1cb05b3-f936-45f0-9b0c-29a83978e9f1\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Oct 03 00:32:44 crc kubenswrapper[4832]: I1003 00:32:44.630861 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-bl95s-push\" (UniqueName: \"kubernetes.io/secret/e1cb05b3-f936-45f0-9b0c-29a83978e9f1-builder-dockercfg-bl95s-push\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"e1cb05b3-f936-45f0-9b0c-29a83978e9f1\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Oct 03 00:32:44 crc kubenswrapper[4832]: I1003 00:32:44.630911 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e1cb05b3-f936-45f0-9b0c-29a83978e9f1-node-pullsecrets\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"e1cb05b3-f936-45f0-9b0c-29a83978e9f1\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Oct 03 00:32:44 crc kubenswrapper[4832]: I1003 00:32:44.630941 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/e1cb05b3-f936-45f0-9b0c-29a83978e9f1-build-blob-cache\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"e1cb05b3-f936-45f0-9b0c-29a83978e9f1\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Oct 03 00:32:44 crc kubenswrapper[4832]: I1003 00:32:44.630967 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/e1cb05b3-f936-45f0-9b0c-29a83978e9f1-build-system-configs\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"e1cb05b3-f936-45f0-9b0c-29a83978e9f1\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Oct 03 00:32:44 crc kubenswrapper[4832]: I1003 00:32:44.630988 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e1cb05b3-f936-45f0-9b0c-29a83978e9f1-build-ca-bundles\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"e1cb05b3-f936-45f0-9b0c-29a83978e9f1\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Oct 03 00:32:44 crc kubenswrapper[4832]: I1003 00:32:44.631007 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/e1cb05b3-f936-45f0-9b0c-29a83978e9f1-buildworkdir\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"e1cb05b3-f936-45f0-9b0c-29a83978e9f1\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Oct 03 00:32:44 crc kubenswrapper[4832]: I1003 00:32:44.631030 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6qgw\" (UniqueName: \"kubernetes.io/projected/e1cb05b3-f936-45f0-9b0c-29a83978e9f1-kube-api-access-f6qgw\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"e1cb05b3-f936-45f0-9b0c-29a83978e9f1\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Oct 03 00:32:44 crc kubenswrapper[4832]: I1003 00:32:44.631062 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e1cb05b3-f936-45f0-9b0c-29a83978e9f1-build-proxy-ca-bundles\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"e1cb05b3-f936-45f0-9b0c-29a83978e9f1\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Oct 03 00:32:44 crc kubenswrapper[4832]: I1003 00:32:44.631099 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/e1cb05b3-f936-45f0-9b0c-29a83978e9f1-container-storage-root\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"e1cb05b3-f936-45f0-9b0c-29a83978e9f1\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Oct 03 00:32:44 crc kubenswrapper[4832]: I1003 00:32:44.631124 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/e1cb05b3-f936-45f0-9b0c-29a83978e9f1-buildcachedir\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"e1cb05b3-f936-45f0-9b0c-29a83978e9f1\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Oct 03 00:32:44 crc kubenswrapper[4832]: I1003 00:32:44.732341 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/e1cb05b3-f936-45f0-9b0c-29a83978e9f1-container-storage-root\") pod \"prometheus-webhook-snmp-1-build\" 
(UID: \"e1cb05b3-f936-45f0-9b0c-29a83978e9f1\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Oct 03 00:32:44 crc kubenswrapper[4832]: I1003 00:32:44.732402 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/e1cb05b3-f936-45f0-9b0c-29a83978e9f1-buildcachedir\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"e1cb05b3-f936-45f0-9b0c-29a83978e9f1\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Oct 03 00:32:44 crc kubenswrapper[4832]: I1003 00:32:44.732500 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-bl95s-pull\" (UniqueName: \"kubernetes.io/secret/e1cb05b3-f936-45f0-9b0c-29a83978e9f1-builder-dockercfg-bl95s-pull\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"e1cb05b3-f936-45f0-9b0c-29a83978e9f1\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Oct 03 00:32:44 crc kubenswrapper[4832]: I1003 00:32:44.732723 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/e1cb05b3-f936-45f0-9b0c-29a83978e9f1-container-storage-run\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"e1cb05b3-f936-45f0-9b0c-29a83978e9f1\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Oct 03 00:32:44 crc kubenswrapper[4832]: I1003 00:32:44.732776 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-bl95s-push\" (UniqueName: \"kubernetes.io/secret/e1cb05b3-f936-45f0-9b0c-29a83978e9f1-builder-dockercfg-bl95s-push\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"e1cb05b3-f936-45f0-9b0c-29a83978e9f1\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Oct 03 00:32:44 crc kubenswrapper[4832]: I1003 00:32:44.732805 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e1cb05b3-f936-45f0-9b0c-29a83978e9f1-node-pullsecrets\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"e1cb05b3-f936-45f0-9b0c-29a83978e9f1\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Oct 03 00:32:44 crc kubenswrapper[4832]: I1003 00:32:44.732833 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/e1cb05b3-f936-45f0-9b0c-29a83978e9f1-build-blob-cache\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"e1cb05b3-f936-45f0-9b0c-29a83978e9f1\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Oct 03 00:32:44 crc kubenswrapper[4832]: I1003 00:32:44.732858 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/e1cb05b3-f936-45f0-9b0c-29a83978e9f1-build-system-configs\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"e1cb05b3-f936-45f0-9b0c-29a83978e9f1\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Oct 03 00:32:44 crc kubenswrapper[4832]: I1003 00:32:44.732880 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e1cb05b3-f936-45f0-9b0c-29a83978e9f1-build-ca-bundles\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"e1cb05b3-f936-45f0-9b0c-29a83978e9f1\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Oct 03 00:32:44 crc kubenswrapper[4832]: I1003 00:32:44.732902 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/e1cb05b3-f936-45f0-9b0c-29a83978e9f1-buildworkdir\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"e1cb05b3-f936-45f0-9b0c-29a83978e9f1\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Oct 03 00:32:44 crc kubenswrapper[4832]: I1003 00:32:44.732926 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f6qgw\" (UniqueName: \"kubernetes.io/projected/e1cb05b3-f936-45f0-9b0c-29a83978e9f1-kube-api-access-f6qgw\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"e1cb05b3-f936-45f0-9b0c-29a83978e9f1\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Oct 03 00:32:44 crc kubenswrapper[4832]: I1003 00:32:44.732947 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e1cb05b3-f936-45f0-9b0c-29a83978e9f1-build-proxy-ca-bundles\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"e1cb05b3-f936-45f0-9b0c-29a83978e9f1\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Oct 03 00:32:44 crc kubenswrapper[4832]: I1003 00:32:44.733038 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/e1cb05b3-f936-45f0-9b0c-29a83978e9f1-container-storage-root\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"e1cb05b3-f936-45f0-9b0c-29a83978e9f1\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Oct 03 00:32:44 crc kubenswrapper[4832]: I1003 00:32:44.733572 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e1cb05b3-f936-45f0-9b0c-29a83978e9f1-build-proxy-ca-bundles\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"e1cb05b3-f936-45f0-9b0c-29a83978e9f1\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Oct 03 00:32:44 crc kubenswrapper[4832]: I1003 00:32:44.733701 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/e1cb05b3-f936-45f0-9b0c-29a83978e9f1-buildcachedir\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"e1cb05b3-f936-45f0-9b0c-29a83978e9f1\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Oct 03 00:32:44 crc kubenswrapper[4832]: I1003 00:32:44.734074 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e1cb05b3-f936-45f0-9b0c-29a83978e9f1-node-pullsecrets\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"e1cb05b3-f936-45f0-9b0c-29a83978e9f1\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Oct 03 00:32:44 crc kubenswrapper[4832]: I1003 00:32:44.734960 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/e1cb05b3-f936-45f0-9b0c-29a83978e9f1-build-blob-cache\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"e1cb05b3-f936-45f0-9b0c-29a83978e9f1\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Oct 03 00:32:44 crc kubenswrapper[4832]: I1003 00:32:44.735334 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/e1cb05b3-f936-45f0-9b0c-29a83978e9f1-buildworkdir\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"e1cb05b3-f936-45f0-9b0c-29a83978e9f1\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Oct 03 00:32:44 crc kubenswrapper[4832]: I1003 
00:32:44.735414 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/e1cb05b3-f936-45f0-9b0c-29a83978e9f1-container-storage-run\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"e1cb05b3-f936-45f0-9b0c-29a83978e9f1\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Oct 03 00:32:44 crc kubenswrapper[4832]: I1003 00:32:44.735655 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/e1cb05b3-f936-45f0-9b0c-29a83978e9f1-build-system-configs\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"e1cb05b3-f936-45f0-9b0c-29a83978e9f1\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Oct 03 00:32:44 crc kubenswrapper[4832]: I1003 00:32:44.736318 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e1cb05b3-f936-45f0-9b0c-29a83978e9f1-build-ca-bundles\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"e1cb05b3-f936-45f0-9b0c-29a83978e9f1\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Oct 03 00:32:44 crc kubenswrapper[4832]: I1003 00:32:44.741186 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-bl95s-push\" (UniqueName: \"kubernetes.io/secret/e1cb05b3-f936-45f0-9b0c-29a83978e9f1-builder-dockercfg-bl95s-push\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"e1cb05b3-f936-45f0-9b0c-29a83978e9f1\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Oct 03 00:32:44 crc kubenswrapper[4832]: I1003 00:32:44.743683 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-bl95s-pull\" (UniqueName: \"kubernetes.io/secret/e1cb05b3-f936-45f0-9b0c-29a83978e9f1-builder-dockercfg-bl95s-pull\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"e1cb05b3-f936-45f0-9b0c-29a83978e9f1\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Oct 03 00:32:44 crc kubenswrapper[4832]: I1003 00:32:44.749304 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6qgw\" (UniqueName: \"kubernetes.io/projected/e1cb05b3-f936-45f0-9b0c-29a83978e9f1-kube-api-access-f6qgw\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"e1cb05b3-f936-45f0-9b0c-29a83978e9f1\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Oct 03 00:32:44 crc kubenswrapper[4832]: I1003 00:32:44.939477 4832 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-1-build" Oct 03 00:32:45 crc kubenswrapper[4832]: I1003 00:32:45.326959 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/prometheus-webhook-snmp-1-build"] Oct 03 00:32:45 crc kubenswrapper[4832]: I1003 00:32:45.852064 4832 generic.go:334] "Generic (PLEG): container finished" podID="e1cb05b3-f936-45f0-9b0c-29a83978e9f1" containerID="8ec71fc3a1672b35934aa9f40e6fce8e329f82164b8e563ca6296a812830b6d6" exitCode=0 Oct 03 00:32:45 crc kubenswrapper[4832]: I1003 00:32:45.852127 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-1-build" event={"ID":"e1cb05b3-f936-45f0-9b0c-29a83978e9f1","Type":"ContainerDied","Data":"8ec71fc3a1672b35934aa9f40e6fce8e329f82164b8e563ca6296a812830b6d6"} Oct 03 00:32:45 crc kubenswrapper[4832]: I1003 00:32:45.852168 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-1-build" event={"ID":"e1cb05b3-f936-45f0-9b0c-29a83978e9f1","Type":"ContainerStarted","Data":"961d2c5f11a666786ae720ea914cee8c4e118399bbad8bc051172ade8d4b805f"} Oct 03 00:32:46 crc kubenswrapper[4832]: I1003 00:32:46.859014 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-1-build" event={"ID":"e1cb05b3-f936-45f0-9b0c-29a83978e9f1","Type":"ContainerStarted","Data":"43d5a40ddffc4da9bf78086ffef2902373ee12deb7067dcb83af95e02d6de965"} Oct 03 00:32:46 crc kubenswrapper[4832]: I1003 00:32:46.888611 4832 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/prometheus-webhook-snmp-1-build" podStartSLOduration=2.888578338 podStartE2EDuration="2.888578338s" podCreationTimestamp="2025-10-03 00:32:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 00:32:46.884363806 +0000 UTC m=+1560.146518922" watchObservedRunningTime="2025-10-03 00:32:46.888578338 +0000 UTC m=+1560.150733474" Oct 03 00:32:54 crc kubenswrapper[4832]: I1003 00:32:54.771026 4832 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/prometheus-webhook-snmp-1-build"] Oct 03 00:32:54 crc kubenswrapper[4832]: I1003 00:32:54.771784 4832 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/prometheus-webhook-snmp-1-build" podUID="e1cb05b3-f936-45f0-9b0c-29a83978e9f1" containerName="docker-build" containerID="cri-o://43d5a40ddffc4da9bf78086ffef2902373ee12deb7067dcb83af95e02d6de965" gracePeriod=30 Oct 03 00:32:55 crc kubenswrapper[4832]: I1003 00:32:55.675476 4832 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_prometheus-webhook-snmp-1-build_e1cb05b3-f936-45f0-9b0c-29a83978e9f1/docker-build/0.log" Oct 03 00:32:55 crc kubenswrapper[4832]: I1003 00:32:55.676339 4832 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-1-build" Oct 03 00:32:55 crc kubenswrapper[4832]: I1003 00:32:55.876421 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/e1cb05b3-f936-45f0-9b0c-29a83978e9f1-container-storage-root\") pod \"e1cb05b3-f936-45f0-9b0c-29a83978e9f1\" (UID: \"e1cb05b3-f936-45f0-9b0c-29a83978e9f1\") " Oct 03 00:32:55 crc kubenswrapper[4832]: I1003 00:32:55.876852 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e1cb05b3-f936-45f0-9b0c-29a83978e9f1-build-proxy-ca-bundles\") pod \"e1cb05b3-f936-45f0-9b0c-29a83978e9f1\" (UID: \"e1cb05b3-f936-45f0-9b0c-29a83978e9f1\") " Oct 03 00:32:55 crc kubenswrapper[4832]: I1003 00:32:55.876882 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/e1cb05b3-f936-45f0-9b0c-29a83978e9f1-container-storage-run\") pod \"e1cb05b3-f936-45f0-9b0c-29a83978e9f1\" (UID: \"e1cb05b3-f936-45f0-9b0c-29a83978e9f1\") " Oct 03 00:32:55 crc kubenswrapper[4832]: I1003 00:32:55.876907 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-bl95s-pull\" (UniqueName: \"kubernetes.io/secret/e1cb05b3-f936-45f0-9b0c-29a83978e9f1-builder-dockercfg-bl95s-pull\") pod \"e1cb05b3-f936-45f0-9b0c-29a83978e9f1\" (UID: \"e1cb05b3-f936-45f0-9b0c-29a83978e9f1\") " Oct 03 00:32:55 crc kubenswrapper[4832]: I1003 00:32:55.876939 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/e1cb05b3-f936-45f0-9b0c-29a83978e9f1-build-system-configs\") pod \"e1cb05b3-f936-45f0-9b0c-29a83978e9f1\" (UID: \"e1cb05b3-f936-45f0-9b0c-29a83978e9f1\") " Oct 03 00:32:55 crc kubenswrapper[4832]: I1003 00:32:55.877003 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/e1cb05b3-f936-45f0-9b0c-29a83978e9f1-build-blob-cache\") pod \"e1cb05b3-f936-45f0-9b0c-29a83978e9f1\" (UID: \"e1cb05b3-f936-45f0-9b0c-29a83978e9f1\") " Oct 03 00:32:55 crc kubenswrapper[4832]: I1003 00:32:55.877133 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/e1cb05b3-f936-45f0-9b0c-29a83978e9f1-buildworkdir\") pod \"e1cb05b3-f936-45f0-9b0c-29a83978e9f1\" (UID: \"e1cb05b3-f936-45f0-9b0c-29a83978e9f1\") " Oct 03 00:32:55 crc kubenswrapper[4832]: I1003 00:32:55.877154 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f6qgw\" (UniqueName: \"kubernetes.io/projected/e1cb05b3-f936-45f0-9b0c-29a83978e9f1-kube-api-access-f6qgw\") pod \"e1cb05b3-f936-45f0-9b0c-29a83978e9f1\" (UID: \"e1cb05b3-f936-45f0-9b0c-29a83978e9f1\") " Oct 03 00:32:55 crc kubenswrapper[4832]: I1003 00:32:55.877813 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1cb05b3-f936-45f0-9b0c-29a83978e9f1-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "e1cb05b3-f936-45f0-9b0c-29a83978e9f1" (UID: "e1cb05b3-f936-45f0-9b0c-29a83978e9f1"). InnerVolumeSpecName "build-system-configs". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 00:32:55 crc kubenswrapper[4832]: I1003 00:32:55.877868 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/e1cb05b3-f936-45f0-9b0c-29a83978e9f1-buildcachedir\") pod \"e1cb05b3-f936-45f0-9b0c-29a83978e9f1\" (UID: \"e1cb05b3-f936-45f0-9b0c-29a83978e9f1\") " Oct 03 00:32:55 crc kubenswrapper[4832]: I1003 00:32:55.877896 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e1cb05b3-f936-45f0-9b0c-29a83978e9f1-node-pullsecrets\") pod \"e1cb05b3-f936-45f0-9b0c-29a83978e9f1\" (UID: \"e1cb05b3-f936-45f0-9b0c-29a83978e9f1\") " Oct 03 00:32:55 crc kubenswrapper[4832]: I1003 00:32:55.877918 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-bl95s-push\" (UniqueName: \"kubernetes.io/secret/e1cb05b3-f936-45f0-9b0c-29a83978e9f1-builder-dockercfg-bl95s-push\") pod \"e1cb05b3-f936-45f0-9b0c-29a83978e9f1\" (UID: \"e1cb05b3-f936-45f0-9b0c-29a83978e9f1\") " Oct 03 00:32:55 crc kubenswrapper[4832]: I1003 00:32:55.877961 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e1cb05b3-f936-45f0-9b0c-29a83978e9f1-build-ca-bundles\") pod \"e1cb05b3-f936-45f0-9b0c-29a83978e9f1\" (UID: \"e1cb05b3-f936-45f0-9b0c-29a83978e9f1\") " Oct 03 00:32:55 crc kubenswrapper[4832]: I1003 00:32:55.878247 4832 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/e1cb05b3-f936-45f0-9b0c-29a83978e9f1-build-system-configs\") on node \"crc\" DevicePath \"\"" Oct 03 00:32:55 crc kubenswrapper[4832]: I1003 00:32:55.878227 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e1cb05b3-f936-45f0-9b0c-29a83978e9f1-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "e1cb05b3-f936-45f0-9b0c-29a83978e9f1" (UID: "e1cb05b3-f936-45f0-9b0c-29a83978e9f1"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 00:32:55 crc kubenswrapper[4832]: I1003 00:32:55.878347 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e1cb05b3-f936-45f0-9b0c-29a83978e9f1-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "e1cb05b3-f936-45f0-9b0c-29a83978e9f1" (UID: "e1cb05b3-f936-45f0-9b0c-29a83978e9f1"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 00:32:55 crc kubenswrapper[4832]: I1003 00:32:55.878568 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e1cb05b3-f936-45f0-9b0c-29a83978e9f1-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "e1cb05b3-f936-45f0-9b0c-29a83978e9f1" (UID: "e1cb05b3-f936-45f0-9b0c-29a83978e9f1"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 00:32:55 crc kubenswrapper[4832]: I1003 00:32:55.878745 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1cb05b3-f936-45f0-9b0c-29a83978e9f1-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "e1cb05b3-f936-45f0-9b0c-29a83978e9f1" (UID: "e1cb05b3-f936-45f0-9b0c-29a83978e9f1"). InnerVolumeSpecName "build-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 00:32:55 crc kubenswrapper[4832]: I1003 00:32:55.878979 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e1cb05b3-f936-45f0-9b0c-29a83978e9f1-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "e1cb05b3-f936-45f0-9b0c-29a83978e9f1" (UID: "e1cb05b3-f936-45f0-9b0c-29a83978e9f1"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 00:32:55 crc kubenswrapper[4832]: I1003 00:32:55.879173 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1cb05b3-f936-45f0-9b0c-29a83978e9f1-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "e1cb05b3-f936-45f0-9b0c-29a83978e9f1" (UID: "e1cb05b3-f936-45f0-9b0c-29a83978e9f1"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 00:32:55 crc kubenswrapper[4832]: I1003 00:32:55.884914 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1cb05b3-f936-45f0-9b0c-29a83978e9f1-builder-dockercfg-bl95s-push" (OuterVolumeSpecName: "builder-dockercfg-bl95s-push") pod "e1cb05b3-f936-45f0-9b0c-29a83978e9f1" (UID: "e1cb05b3-f936-45f0-9b0c-29a83978e9f1"). InnerVolumeSpecName "builder-dockercfg-bl95s-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 00:32:55 crc kubenswrapper[4832]: I1003 00:32:55.886928 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1cb05b3-f936-45f0-9b0c-29a83978e9f1-builder-dockercfg-bl95s-pull" (OuterVolumeSpecName: "builder-dockercfg-bl95s-pull") pod "e1cb05b3-f936-45f0-9b0c-29a83978e9f1" (UID: "e1cb05b3-f936-45f0-9b0c-29a83978e9f1"). InnerVolumeSpecName "builder-dockercfg-bl95s-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 00:32:55 crc kubenswrapper[4832]: I1003 00:32:55.892018 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1cb05b3-f936-45f0-9b0c-29a83978e9f1-kube-api-access-f6qgw" (OuterVolumeSpecName: "kube-api-access-f6qgw") pod "e1cb05b3-f936-45f0-9b0c-29a83978e9f1" (UID: "e1cb05b3-f936-45f0-9b0c-29a83978e9f1"). InnerVolumeSpecName "kube-api-access-f6qgw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 00:32:55 crc kubenswrapper[4832]: I1003 00:32:55.910639 4832 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_prometheus-webhook-snmp-1-build_e1cb05b3-f936-45f0-9b0c-29a83978e9f1/docker-build/0.log" Oct 03 00:32:55 crc kubenswrapper[4832]: I1003 00:32:55.911615 4832 generic.go:334] "Generic (PLEG): container finished" podID="e1cb05b3-f936-45f0-9b0c-29a83978e9f1" containerID="43d5a40ddffc4da9bf78086ffef2902373ee12deb7067dcb83af95e02d6de965" exitCode=1 Oct 03 00:32:55 crc kubenswrapper[4832]: I1003 00:32:55.911662 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-1-build" event={"ID":"e1cb05b3-f936-45f0-9b0c-29a83978e9f1","Type":"ContainerDied","Data":"43d5a40ddffc4da9bf78086ffef2902373ee12deb7067dcb83af95e02d6de965"} Oct 03 00:32:55 crc kubenswrapper[4832]: I1003 00:32:55.911732 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-1-build" event={"ID":"e1cb05b3-f936-45f0-9b0c-29a83978e9f1","Type":"ContainerDied","Data":"961d2c5f11a666786ae720ea914cee8c4e118399bbad8bc051172ade8d4b805f"} Oct 03 00:32:55 crc kubenswrapper[4832]: I1003 00:32:55.911770 4832 scope.go:117] "RemoveContainer" containerID="43d5a40ddffc4da9bf78086ffef2902373ee12deb7067dcb83af95e02d6de965" Oct 03 00:32:55 crc kubenswrapper[4832]: I1003 00:32:55.911819 4832 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-1-build" Oct 03 00:32:55 crc kubenswrapper[4832]: I1003 00:32:55.929745 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e1cb05b3-f936-45f0-9b0c-29a83978e9f1-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "e1cb05b3-f936-45f0-9b0c-29a83978e9f1" (UID: "e1cb05b3-f936-45f0-9b0c-29a83978e9f1"). InnerVolumeSpecName "build-blob-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 00:32:55 crc kubenswrapper[4832]: I1003 00:32:55.936544 4832 scope.go:117] "RemoveContainer" containerID="8ec71fc3a1672b35934aa9f40e6fce8e329f82164b8e563ca6296a812830b6d6" Oct 03 00:32:55 crc kubenswrapper[4832]: I1003 00:32:55.957228 4832 scope.go:117] "RemoveContainer" containerID="43d5a40ddffc4da9bf78086ffef2902373ee12deb7067dcb83af95e02d6de965" Oct 03 00:32:55 crc kubenswrapper[4832]: E1003 00:32:55.957804 4832 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43d5a40ddffc4da9bf78086ffef2902373ee12deb7067dcb83af95e02d6de965\": container with ID starting with 43d5a40ddffc4da9bf78086ffef2902373ee12deb7067dcb83af95e02d6de965 not found: ID does not exist" containerID="43d5a40ddffc4da9bf78086ffef2902373ee12deb7067dcb83af95e02d6de965" Oct 03 00:32:55 crc kubenswrapper[4832]: I1003 00:32:55.957856 4832 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43d5a40ddffc4da9bf78086ffef2902373ee12deb7067dcb83af95e02d6de965"} err="failed to get container status \"43d5a40ddffc4da9bf78086ffef2902373ee12deb7067dcb83af95e02d6de965\": rpc error: code = NotFound desc = could not find container \"43d5a40ddffc4da9bf78086ffef2902373ee12deb7067dcb83af95e02d6de965\": container with ID starting with 43d5a40ddffc4da9bf78086ffef2902373ee12deb7067dcb83af95e02d6de965 not found: ID does not exist" Oct 03 00:32:55 crc kubenswrapper[4832]: I1003 00:32:55.957887 4832 scope.go:117] "RemoveContainer" containerID="8ec71fc3a1672b35934aa9f40e6fce8e329f82164b8e563ca6296a812830b6d6" Oct 03 00:32:55 crc kubenswrapper[4832]: E1003 00:32:55.958318 4832 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ec71fc3a1672b35934aa9f40e6fce8e329f82164b8e563ca6296a812830b6d6\": container with ID starting with 8ec71fc3a1672b35934aa9f40e6fce8e329f82164b8e563ca6296a812830b6d6 not found: ID does not exist" containerID="8ec71fc3a1672b35934aa9f40e6fce8e329f82164b8e563ca6296a812830b6d6" Oct 03 00:32:55 crc kubenswrapper[4832]: I1003 00:32:55.958367 4832 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ec71fc3a1672b35934aa9f40e6fce8e329f82164b8e563ca6296a812830b6d6"} err="failed to get container status \"8ec71fc3a1672b35934aa9f40e6fce8e329f82164b8e563ca6296a812830b6d6\": rpc error: code = NotFound desc = could not find container \"8ec71fc3a1672b35934aa9f40e6fce8e329f82164b8e563ca6296a812830b6d6\": container with ID starting with 8ec71fc3a1672b35934aa9f40e6fce8e329f82164b8e563ca6296a812830b6d6 not found: ID does not exist" Oct 03 00:32:55 crc kubenswrapper[4832]: I1003 00:32:55.978707 4832 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/e1cb05b3-f936-45f0-9b0c-29a83978e9f1-build-blob-cache\") on node \"crc\" DevicePath \"\"" Oct 03 00:32:55 crc kubenswrapper[4832]: I1003 00:32:55.978740 4832 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/e1cb05b3-f936-45f0-9b0c-29a83978e9f1-buildworkdir\") on node \"crc\" DevicePath \"\"" Oct 03 00:32:55 crc kubenswrapper[4832]: I1003 00:32:55.978771 4832 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f6qgw\" (UniqueName: \"kubernetes.io/projected/e1cb05b3-f936-45f0-9b0c-29a83978e9f1-kube-api-access-f6qgw\") on node \"crc\" DevicePath \"\"" Oct 03 00:32:55 crc 
kubenswrapper[4832]: I1003 00:32:55.978782 4832 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/e1cb05b3-f936-45f0-9b0c-29a83978e9f1-buildcachedir\") on node \"crc\" DevicePath \"\"" Oct 03 00:32:55 crc kubenswrapper[4832]: I1003 00:32:55.978791 4832 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e1cb05b3-f936-45f0-9b0c-29a83978e9f1-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Oct 03 00:32:55 crc kubenswrapper[4832]: I1003 00:32:55.978799 4832 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-bl95s-push\" (UniqueName: \"kubernetes.io/secret/e1cb05b3-f936-45f0-9b0c-29a83978e9f1-builder-dockercfg-bl95s-push\") on node \"crc\" DevicePath \"\"" Oct 03 00:32:55 crc kubenswrapper[4832]: I1003 00:32:55.978808 4832 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e1cb05b3-f936-45f0-9b0c-29a83978e9f1-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 03 00:32:55 crc kubenswrapper[4832]: I1003 00:32:55.978817 4832 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e1cb05b3-f936-45f0-9b0c-29a83978e9f1-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 03 00:32:55 crc kubenswrapper[4832]: I1003 00:32:55.978825 4832 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/e1cb05b3-f936-45f0-9b0c-29a83978e9f1-container-storage-run\") on node \"crc\" DevicePath \"\"" Oct 03 00:32:55 crc kubenswrapper[4832]: I1003 00:32:55.978833 4832 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-bl95s-pull\" (UniqueName: \"kubernetes.io/secret/e1cb05b3-f936-45f0-9b0c-29a83978e9f1-builder-dockercfg-bl95s-pull\") on node \"crc\" DevicePath \"\"" Oct 03 00:32:56 crc kubenswrapper[4832]: I1003 00:32:56.177623 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e1cb05b3-f936-45f0-9b0c-29a83978e9f1-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "e1cb05b3-f936-45f0-9b0c-29a83978e9f1" (UID: "e1cb05b3-f936-45f0-9b0c-29a83978e9f1"). InnerVolumeSpecName "container-storage-root". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 00:32:56 crc kubenswrapper[4832]: I1003 00:32:56.181469 4832 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/e1cb05b3-f936-45f0-9b0c-29a83978e9f1-container-storage-root\") on node \"crc\" DevicePath \"\"" Oct 03 00:32:56 crc kubenswrapper[4832]: I1003 00:32:56.249846 4832 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/prometheus-webhook-snmp-1-build"] Oct 03 00:32:56 crc kubenswrapper[4832]: I1003 00:32:56.254402 4832 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/prometheus-webhook-snmp-1-build"] Oct 03 00:32:56 crc kubenswrapper[4832]: I1003 00:32:56.464973 4832 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/prometheus-webhook-snmp-2-build"] Oct 03 00:32:56 crc kubenswrapper[4832]: E1003 00:32:56.465270 4832 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1cb05b3-f936-45f0-9b0c-29a83978e9f1" containerName="manage-dockerfile" Oct 03 00:32:56 crc kubenswrapper[4832]: I1003 00:32:56.465288 4832 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1cb05b3-f936-45f0-9b0c-29a83978e9f1" containerName="manage-dockerfile" Oct 03 00:32:56 crc kubenswrapper[4832]: E1003 00:32:56.465305 4832 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1cb05b3-f936-45f0-9b0c-29a83978e9f1" containerName="docker-build" Oct 03 00:32:56 crc kubenswrapper[4832]: I1003 00:32:56.465314 4832 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1cb05b3-f936-45f0-9b0c-29a83978e9f1" containerName="docker-build" Oct 03 00:32:56 crc kubenswrapper[4832]: I1003 00:32:56.465452 4832 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1cb05b3-f936-45f0-9b0c-29a83978e9f1" containerName="docker-build" Oct 03 00:32:56 crc kubenswrapper[4832]: I1003 00:32:56.466487 4832 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-2-build" Oct 03 00:32:56 crc kubenswrapper[4832]: I1003 00:32:56.468480 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-webhook-snmp-2-global-ca" Oct 03 00:32:56 crc kubenswrapper[4832]: I1003 00:32:56.468744 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-webhook-snmp-2-sys-config" Oct 03 00:32:56 crc kubenswrapper[4832]: I1003 00:32:56.468912 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-webhook-snmp-2-ca" Oct 03 00:32:56 crc kubenswrapper[4832]: I1003 00:32:56.469296 4832 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-bl95s" Oct 03 00:32:56 crc kubenswrapper[4832]: I1003 00:32:56.486448 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c556531c-c24e-48b4-92d7-0eb57a6ff724-build-ca-bundles\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"c556531c-c24e-48b4-92d7-0eb57a6ff724\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Oct 03 00:32:56 crc kubenswrapper[4832]: I1003 00:32:56.486796 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/c556531c-c24e-48b4-92d7-0eb57a6ff724-buildworkdir\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"c556531c-c24e-48b4-92d7-0eb57a6ff724\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Oct 03 00:32:56 crc kubenswrapper[4832]: I1003 00:32:56.486954 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-bl95s-push\" (UniqueName: \"kubernetes.io/secret/c556531c-c24e-48b4-92d7-0eb57a6ff724-builder-dockercfg-bl95s-push\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"c556531c-c24e-48b4-92d7-0eb57a6ff724\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Oct 03 00:32:56 crc kubenswrapper[4832]: I1003 00:32:56.487257 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/c556531c-c24e-48b4-92d7-0eb57a6ff724-container-storage-root\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"c556531c-c24e-48b4-92d7-0eb57a6ff724\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Oct 03 00:32:56 crc kubenswrapper[4832]: I1003 00:32:56.487384 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-bl95s-pull\" (UniqueName: \"kubernetes.io/secret/c556531c-c24e-48b4-92d7-0eb57a6ff724-builder-dockercfg-bl95s-pull\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"c556531c-c24e-48b4-92d7-0eb57a6ff724\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Oct 03 00:32:56 crc kubenswrapper[4832]: I1003 00:32:56.487494 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/c556531c-c24e-48b4-92d7-0eb57a6ff724-build-blob-cache\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"c556531c-c24e-48b4-92d7-0eb57a6ff724\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Oct 03 00:32:56 crc kubenswrapper[4832]: I1003 00:32:56.487596 4832 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/c556531c-c24e-48b4-92d7-0eb57a6ff724-node-pullsecrets\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"c556531c-c24e-48b4-92d7-0eb57a6ff724\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Oct 03 00:32:56 crc kubenswrapper[4832]: I1003 00:32:56.487727 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/c556531c-c24e-48b4-92d7-0eb57a6ff724-build-system-configs\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"c556531c-c24e-48b4-92d7-0eb57a6ff724\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Oct 03 00:32:56 crc kubenswrapper[4832]: I1003 00:32:56.487865 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/c556531c-c24e-48b4-92d7-0eb57a6ff724-buildcachedir\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"c556531c-c24e-48b4-92d7-0eb57a6ff724\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Oct 03 00:32:56 crc kubenswrapper[4832]: I1003 00:32:56.487964 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c556531c-c24e-48b4-92d7-0eb57a6ff724-build-proxy-ca-bundles\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"c556531c-c24e-48b4-92d7-0eb57a6ff724\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Oct 03 00:32:56 crc kubenswrapper[4832]: I1003 00:32:56.488058 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6dgn\" (UniqueName: \"kubernetes.io/projected/c556531c-c24e-48b4-92d7-0eb57a6ff724-kube-api-access-t6dgn\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"c556531c-c24e-48b4-92d7-0eb57a6ff724\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Oct 03 00:32:56 crc kubenswrapper[4832]: I1003 00:32:56.488179 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/c556531c-c24e-48b4-92d7-0eb57a6ff724-container-storage-run\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"c556531c-c24e-48b4-92d7-0eb57a6ff724\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Oct 03 00:32:56 crc kubenswrapper[4832]: I1003 00:32:56.487010 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/prometheus-webhook-snmp-2-build"] Oct 03 00:32:56 crc kubenswrapper[4832]: I1003 00:32:56.588985 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/c556531c-c24e-48b4-92d7-0eb57a6ff724-buildcachedir\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"c556531c-c24e-48b4-92d7-0eb57a6ff724\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Oct 03 00:32:56 crc kubenswrapper[4832]: I1003 00:32:56.589026 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c556531c-c24e-48b4-92d7-0eb57a6ff724-build-proxy-ca-bundles\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"c556531c-c24e-48b4-92d7-0eb57a6ff724\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Oct 03 00:32:56 crc kubenswrapper[4832]: I1003 00:32:56.589052 
4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t6dgn\" (UniqueName: \"kubernetes.io/projected/c556531c-c24e-48b4-92d7-0eb57a6ff724-kube-api-access-t6dgn\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"c556531c-c24e-48b4-92d7-0eb57a6ff724\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Oct 03 00:32:56 crc kubenswrapper[4832]: I1003 00:32:56.589060 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/c556531c-c24e-48b4-92d7-0eb57a6ff724-buildcachedir\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"c556531c-c24e-48b4-92d7-0eb57a6ff724\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Oct 03 00:32:56 crc kubenswrapper[4832]: I1003 00:32:56.589079 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/c556531c-c24e-48b4-92d7-0eb57a6ff724-container-storage-run\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"c556531c-c24e-48b4-92d7-0eb57a6ff724\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Oct 03 00:32:56 crc kubenswrapper[4832]: I1003 00:32:56.589115 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c556531c-c24e-48b4-92d7-0eb57a6ff724-build-ca-bundles\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"c556531c-c24e-48b4-92d7-0eb57a6ff724\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Oct 03 00:32:56 crc kubenswrapper[4832]: I1003 00:32:56.589133 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/c556531c-c24e-48b4-92d7-0eb57a6ff724-buildworkdir\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"c556531c-c24e-48b4-92d7-0eb57a6ff724\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Oct 03 00:32:56 crc kubenswrapper[4832]: I1003 00:32:56.589163 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-bl95s-push\" (UniqueName: \"kubernetes.io/secret/c556531c-c24e-48b4-92d7-0eb57a6ff724-builder-dockercfg-bl95s-push\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"c556531c-c24e-48b4-92d7-0eb57a6ff724\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Oct 03 00:32:56 crc kubenswrapper[4832]: I1003 00:32:56.589178 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/c556531c-c24e-48b4-92d7-0eb57a6ff724-container-storage-root\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"c556531c-c24e-48b4-92d7-0eb57a6ff724\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Oct 03 00:32:56 crc kubenswrapper[4832]: I1003 00:32:56.589199 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-bl95s-pull\" (UniqueName: \"kubernetes.io/secret/c556531c-c24e-48b4-92d7-0eb57a6ff724-builder-dockercfg-bl95s-pull\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"c556531c-c24e-48b4-92d7-0eb57a6ff724\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Oct 03 00:32:56 crc kubenswrapper[4832]: I1003 00:32:56.589215 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/c556531c-c24e-48b4-92d7-0eb57a6ff724-build-blob-cache\") pod \"prometheus-webhook-snmp-2-build\" (UID: 
\"c556531c-c24e-48b4-92d7-0eb57a6ff724\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Oct 03 00:32:56 crc kubenswrapper[4832]: I1003 00:32:56.589237 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/c556531c-c24e-48b4-92d7-0eb57a6ff724-node-pullsecrets\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"c556531c-c24e-48b4-92d7-0eb57a6ff724\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Oct 03 00:32:56 crc kubenswrapper[4832]: I1003 00:32:56.589275 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/c556531c-c24e-48b4-92d7-0eb57a6ff724-build-system-configs\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"c556531c-c24e-48b4-92d7-0eb57a6ff724\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Oct 03 00:32:56 crc kubenswrapper[4832]: I1003 00:32:56.589478 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/c556531c-c24e-48b4-92d7-0eb57a6ff724-container-storage-run\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"c556531c-c24e-48b4-92d7-0eb57a6ff724\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Oct 03 00:32:56 crc kubenswrapper[4832]: I1003 00:32:56.589738 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/c556531c-c24e-48b4-92d7-0eb57a6ff724-container-storage-root\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"c556531c-c24e-48b4-92d7-0eb57a6ff724\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Oct 03 00:32:56 crc kubenswrapper[4832]: I1003 00:32:56.589863 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/c556531c-c24e-48b4-92d7-0eb57a6ff724-build-system-configs\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"c556531c-c24e-48b4-92d7-0eb57a6ff724\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Oct 03 00:32:56 crc kubenswrapper[4832]: I1003 00:32:56.589953 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/c556531c-c24e-48b4-92d7-0eb57a6ff724-node-pullsecrets\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"c556531c-c24e-48b4-92d7-0eb57a6ff724\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Oct 03 00:32:56 crc kubenswrapper[4832]: I1003 00:32:56.590235 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/c556531c-c24e-48b4-92d7-0eb57a6ff724-buildworkdir\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"c556531c-c24e-48b4-92d7-0eb57a6ff724\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Oct 03 00:32:56 crc kubenswrapper[4832]: I1003 00:32:56.590383 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/c556531c-c24e-48b4-92d7-0eb57a6ff724-build-blob-cache\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"c556531c-c24e-48b4-92d7-0eb57a6ff724\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Oct 03 00:32:56 crc kubenswrapper[4832]: I1003 00:32:56.590518 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/c556531c-c24e-48b4-92d7-0eb57a6ff724-build-ca-bundles\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"c556531c-c24e-48b4-92d7-0eb57a6ff724\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Oct 03 00:32:56 crc kubenswrapper[4832]: I1003 00:32:56.590935 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c556531c-c24e-48b4-92d7-0eb57a6ff724-build-proxy-ca-bundles\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"c556531c-c24e-48b4-92d7-0eb57a6ff724\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Oct 03 00:32:56 crc kubenswrapper[4832]: I1003 00:32:56.592980 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-bl95s-push\" (UniqueName: \"kubernetes.io/secret/c556531c-c24e-48b4-92d7-0eb57a6ff724-builder-dockercfg-bl95s-push\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"c556531c-c24e-48b4-92d7-0eb57a6ff724\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Oct 03 00:32:56 crc kubenswrapper[4832]: I1003 00:32:56.594233 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-bl95s-pull\" (UniqueName: \"kubernetes.io/secret/c556531c-c24e-48b4-92d7-0eb57a6ff724-builder-dockercfg-bl95s-pull\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"c556531c-c24e-48b4-92d7-0eb57a6ff724\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Oct 03 00:32:56 crc kubenswrapper[4832]: I1003 00:32:56.604711 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6dgn\" (UniqueName: \"kubernetes.io/projected/c556531c-c24e-48b4-92d7-0eb57a6ff724-kube-api-access-t6dgn\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"c556531c-c24e-48b4-92d7-0eb57a6ff724\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Oct 03 00:32:56 crc kubenswrapper[4832]: I1003 00:32:56.786225 4832 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-2-build" Oct 03 00:32:56 crc kubenswrapper[4832]: I1003 00:32:56.964687 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/prometheus-webhook-snmp-2-build"] Oct 03 00:32:57 crc kubenswrapper[4832]: I1003 00:32:57.389725 4832 scope.go:117] "RemoveContainer" containerID="e44b7de9ac0dc4f733d645c5d56e63447d6c890efc8c243af73d5c7937b609ee" Oct 03 00:32:57 crc kubenswrapper[4832]: E1003 00:32:57.390043 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xzpk4_openshift-machine-config-operator(13eeb165-19f7-4e3f-a442-1c9c570d853b)\"" pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" podUID="13eeb165-19f7-4e3f-a442-1c9c570d853b" Oct 03 00:32:57 crc kubenswrapper[4832]: I1003 00:32:57.392960 4832 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1cb05b3-f936-45f0-9b0c-29a83978e9f1" path="/var/lib/kubelet/pods/e1cb05b3-f936-45f0-9b0c-29a83978e9f1/volumes" Oct 03 00:32:57 crc kubenswrapper[4832]: I1003 00:32:57.927689 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-2-build" event={"ID":"c556531c-c24e-48b4-92d7-0eb57a6ff724","Type":"ContainerStarted","Data":"12dcb4297b4c29da7f542c5de4a1328f926804d7c8ed9343bc919eb4580f9a03"} Oct 03 00:32:57 crc kubenswrapper[4832]: I1003 00:32:57.928201 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-2-build" event={"ID":"c556531c-c24e-48b4-92d7-0eb57a6ff724","Type":"ContainerStarted","Data":"aaa1caf6d1c855b0828bd6e75197bb730a31fbbafe838d771888f394aa8bb5b8"} Oct 03 00:32:58 crc kubenswrapper[4832]: I1003 00:32:58.934122 4832 generic.go:334] "Generic (PLEG): container finished" podID="c556531c-c24e-48b4-92d7-0eb57a6ff724" containerID="12dcb4297b4c29da7f542c5de4a1328f926804d7c8ed9343bc919eb4580f9a03" exitCode=0 Oct 03 00:32:58 crc kubenswrapper[4832]: I1003 00:32:58.934174 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-2-build" event={"ID":"c556531c-c24e-48b4-92d7-0eb57a6ff724","Type":"ContainerDied","Data":"12dcb4297b4c29da7f542c5de4a1328f926804d7c8ed9343bc919eb4580f9a03"} Oct 03 00:32:59 crc kubenswrapper[4832]: I1003 00:32:59.944776 4832 generic.go:334] "Generic (PLEG): container finished" podID="c556531c-c24e-48b4-92d7-0eb57a6ff724" containerID="999e14ebd8ade0c3df2f15d0b87e4872a9aec20ae8a9d75fe75c523ed3923438" exitCode=0 Oct 03 00:32:59 crc kubenswrapper[4832]: I1003 00:32:59.944884 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-2-build" event={"ID":"c556531c-c24e-48b4-92d7-0eb57a6ff724","Type":"ContainerDied","Data":"999e14ebd8ade0c3df2f15d0b87e4872a9aec20ae8a9d75fe75c523ed3923438"} Oct 03 00:32:59 crc kubenswrapper[4832]: I1003 00:32:59.978981 4832 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_prometheus-webhook-snmp-2-build_c556531c-c24e-48b4-92d7-0eb57a6ff724/manage-dockerfile/0.log" Oct 03 00:33:00 crc kubenswrapper[4832]: I1003 00:33:00.966604 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-2-build" event={"ID":"c556531c-c24e-48b4-92d7-0eb57a6ff724","Type":"ContainerStarted","Data":"1b39489d401b87c4d6641a8ed7a9260a442e1d1e9261e809d589980832de91a4"} 
Oct 03 00:33:00 crc kubenswrapper[4832]: I1003 00:33:00.993150 4832 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/prometheus-webhook-snmp-2-build" podStartSLOduration=4.993112323 podStartE2EDuration="4.993112323s" podCreationTimestamp="2025-10-03 00:32:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 00:33:00.989450378 +0000 UTC m=+1574.251605494" watchObservedRunningTime="2025-10-03 00:33:00.993112323 +0000 UTC m=+1574.255267429" Oct 03 00:33:09 crc kubenswrapper[4832]: I1003 00:33:09.386319 4832 scope.go:117] "RemoveContainer" containerID="e44b7de9ac0dc4f733d645c5d56e63447d6c890efc8c243af73d5c7937b609ee" Oct 03 00:33:09 crc kubenswrapper[4832]: E1003 00:33:09.386871 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xzpk4_openshift-machine-config-operator(13eeb165-19f7-4e3f-a442-1c9c570d853b)\"" pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" podUID="13eeb165-19f7-4e3f-a442-1c9c570d853b" Oct 03 00:33:22 crc kubenswrapper[4832]: I1003 00:33:22.385942 4832 scope.go:117] "RemoveContainer" containerID="e44b7de9ac0dc4f733d645c5d56e63447d6c890efc8c243af73d5c7937b609ee" Oct 03 00:33:22 crc kubenswrapper[4832]: E1003 00:33:22.386657 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xzpk4_openshift-machine-config-operator(13eeb165-19f7-4e3f-a442-1c9c570d853b)\"" pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" podUID="13eeb165-19f7-4e3f-a442-1c9c570d853b" Oct 03 00:33:37 crc kubenswrapper[4832]: I1003 00:33:37.395218 4832 scope.go:117] "RemoveContainer" containerID="e44b7de9ac0dc4f733d645c5d56e63447d6c890efc8c243af73d5c7937b609ee" Oct 03 00:33:37 crc kubenswrapper[4832]: E1003 00:33:37.395848 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xzpk4_openshift-machine-config-operator(13eeb165-19f7-4e3f-a442-1c9c570d853b)\"" pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" podUID="13eeb165-19f7-4e3f-a442-1c9c570d853b" Oct 03 00:33:51 crc kubenswrapper[4832]: I1003 00:33:51.386486 4832 scope.go:117] "RemoveContainer" containerID="e44b7de9ac0dc4f733d645c5d56e63447d6c890efc8c243af73d5c7937b609ee" Oct 03 00:33:51 crc kubenswrapper[4832]: E1003 00:33:51.388208 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xzpk4_openshift-machine-config-operator(13eeb165-19f7-4e3f-a442-1c9c570d853b)\"" pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" podUID="13eeb165-19f7-4e3f-a442-1c9c570d853b" Oct 03 00:33:55 crc kubenswrapper[4832]: I1003 00:33:55.338458 4832 generic.go:334] "Generic (PLEG): container finished" podID="c556531c-c24e-48b4-92d7-0eb57a6ff724" containerID="1b39489d401b87c4d6641a8ed7a9260a442e1d1e9261e809d589980832de91a4" exitCode=0 Oct 03 00:33:55 crc 
kubenswrapper[4832]: I1003 00:33:55.338525 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-2-build" event={"ID":"c556531c-c24e-48b4-92d7-0eb57a6ff724","Type":"ContainerDied","Data":"1b39489d401b87c4d6641a8ed7a9260a442e1d1e9261e809d589980832de91a4"} Oct 03 00:33:56 crc kubenswrapper[4832]: I1003 00:33:56.575577 4832 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-2-build" Oct 03 00:33:56 crc kubenswrapper[4832]: I1003 00:33:56.766635 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-bl95s-pull\" (UniqueName: \"kubernetes.io/secret/c556531c-c24e-48b4-92d7-0eb57a6ff724-builder-dockercfg-bl95s-pull\") pod \"c556531c-c24e-48b4-92d7-0eb57a6ff724\" (UID: \"c556531c-c24e-48b4-92d7-0eb57a6ff724\") " Oct 03 00:33:56 crc kubenswrapper[4832]: I1003 00:33:56.766698 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c556531c-c24e-48b4-92d7-0eb57a6ff724-build-proxy-ca-bundles\") pod \"c556531c-c24e-48b4-92d7-0eb57a6ff724\" (UID: \"c556531c-c24e-48b4-92d7-0eb57a6ff724\") " Oct 03 00:33:56 crc kubenswrapper[4832]: I1003 00:33:56.766726 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/c556531c-c24e-48b4-92d7-0eb57a6ff724-buildworkdir\") pod \"c556531c-c24e-48b4-92d7-0eb57a6ff724\" (UID: \"c556531c-c24e-48b4-92d7-0eb57a6ff724\") " Oct 03 00:33:56 crc kubenswrapper[4832]: I1003 00:33:56.766818 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/c556531c-c24e-48b4-92d7-0eb57a6ff724-container-storage-run\") pod \"c556531c-c24e-48b4-92d7-0eb57a6ff724\" (UID: \"c556531c-c24e-48b4-92d7-0eb57a6ff724\") " Oct 03 00:33:56 crc kubenswrapper[4832]: I1003 00:33:56.766841 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/c556531c-c24e-48b4-92d7-0eb57a6ff724-buildcachedir\") pod \"c556531c-c24e-48b4-92d7-0eb57a6ff724\" (UID: \"c556531c-c24e-48b4-92d7-0eb57a6ff724\") " Oct 03 00:33:56 crc kubenswrapper[4832]: I1003 00:33:56.766897 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c556531c-c24e-48b4-92d7-0eb57a6ff724-build-ca-bundles\") pod \"c556531c-c24e-48b4-92d7-0eb57a6ff724\" (UID: \"c556531c-c24e-48b4-92d7-0eb57a6ff724\") " Oct 03 00:33:56 crc kubenswrapper[4832]: I1003 00:33:56.766926 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t6dgn\" (UniqueName: \"kubernetes.io/projected/c556531c-c24e-48b4-92d7-0eb57a6ff724-kube-api-access-t6dgn\") pod \"c556531c-c24e-48b4-92d7-0eb57a6ff724\" (UID: \"c556531c-c24e-48b4-92d7-0eb57a6ff724\") " Oct 03 00:33:56 crc kubenswrapper[4832]: I1003 00:33:56.766948 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/c556531c-c24e-48b4-92d7-0eb57a6ff724-build-system-configs\") pod \"c556531c-c24e-48b4-92d7-0eb57a6ff724\" (UID: \"c556531c-c24e-48b4-92d7-0eb57a6ff724\") " Oct 03 00:33:56 crc kubenswrapper[4832]: I1003 00:33:56.766974 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"builder-dockercfg-bl95s-push\" (UniqueName: \"kubernetes.io/secret/c556531c-c24e-48b4-92d7-0eb57a6ff724-builder-dockercfg-bl95s-push\") pod \"c556531c-c24e-48b4-92d7-0eb57a6ff724\" (UID: \"c556531c-c24e-48b4-92d7-0eb57a6ff724\") " Oct 03 00:33:56 crc kubenswrapper[4832]: I1003 00:33:56.767011 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/c556531c-c24e-48b4-92d7-0eb57a6ff724-build-blob-cache\") pod \"c556531c-c24e-48b4-92d7-0eb57a6ff724\" (UID: \"c556531c-c24e-48b4-92d7-0eb57a6ff724\") " Oct 03 00:33:56 crc kubenswrapper[4832]: I1003 00:33:56.767042 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/c556531c-c24e-48b4-92d7-0eb57a6ff724-node-pullsecrets\") pod \"c556531c-c24e-48b4-92d7-0eb57a6ff724\" (UID: \"c556531c-c24e-48b4-92d7-0eb57a6ff724\") " Oct 03 00:33:56 crc kubenswrapper[4832]: I1003 00:33:56.767068 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/c556531c-c24e-48b4-92d7-0eb57a6ff724-container-storage-root\") pod \"c556531c-c24e-48b4-92d7-0eb57a6ff724\" (UID: \"c556531c-c24e-48b4-92d7-0eb57a6ff724\") " Oct 03 00:33:56 crc kubenswrapper[4832]: I1003 00:33:56.766972 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c556531c-c24e-48b4-92d7-0eb57a6ff724-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "c556531c-c24e-48b4-92d7-0eb57a6ff724" (UID: "c556531c-c24e-48b4-92d7-0eb57a6ff724"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 00:33:56 crc kubenswrapper[4832]: I1003 00:33:56.768133 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c556531c-c24e-48b4-92d7-0eb57a6ff724-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "c556531c-c24e-48b4-92d7-0eb57a6ff724" (UID: "c556531c-c24e-48b4-92d7-0eb57a6ff724"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 00:33:56 crc kubenswrapper[4832]: I1003 00:33:56.768159 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c556531c-c24e-48b4-92d7-0eb57a6ff724-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "c556531c-c24e-48b4-92d7-0eb57a6ff724" (UID: "c556531c-c24e-48b4-92d7-0eb57a6ff724"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 00:33:56 crc kubenswrapper[4832]: I1003 00:33:56.768295 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c556531c-c24e-48b4-92d7-0eb57a6ff724-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "c556531c-c24e-48b4-92d7-0eb57a6ff724" (UID: "c556531c-c24e-48b4-92d7-0eb57a6ff724"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 00:33:56 crc kubenswrapper[4832]: I1003 00:33:56.768407 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c556531c-c24e-48b4-92d7-0eb57a6ff724-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "c556531c-c24e-48b4-92d7-0eb57a6ff724" (UID: "c556531c-c24e-48b4-92d7-0eb57a6ff724"). InnerVolumeSpecName "build-system-configs". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 00:33:56 crc kubenswrapper[4832]: I1003 00:33:56.768515 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c556531c-c24e-48b4-92d7-0eb57a6ff724-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "c556531c-c24e-48b4-92d7-0eb57a6ff724" (UID: "c556531c-c24e-48b4-92d7-0eb57a6ff724"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 00:33:56 crc kubenswrapper[4832]: I1003 00:33:56.769825 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c556531c-c24e-48b4-92d7-0eb57a6ff724-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "c556531c-c24e-48b4-92d7-0eb57a6ff724" (UID: "c556531c-c24e-48b4-92d7-0eb57a6ff724"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 00:33:56 crc kubenswrapper[4832]: I1003 00:33:56.772518 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c556531c-c24e-48b4-92d7-0eb57a6ff724-builder-dockercfg-bl95s-push" (OuterVolumeSpecName: "builder-dockercfg-bl95s-push") pod "c556531c-c24e-48b4-92d7-0eb57a6ff724" (UID: "c556531c-c24e-48b4-92d7-0eb57a6ff724"). InnerVolumeSpecName "builder-dockercfg-bl95s-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 00:33:56 crc kubenswrapper[4832]: I1003 00:33:56.772519 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c556531c-c24e-48b4-92d7-0eb57a6ff724-kube-api-access-t6dgn" (OuterVolumeSpecName: "kube-api-access-t6dgn") pod "c556531c-c24e-48b4-92d7-0eb57a6ff724" (UID: "c556531c-c24e-48b4-92d7-0eb57a6ff724"). InnerVolumeSpecName "kube-api-access-t6dgn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 00:33:56 crc kubenswrapper[4832]: I1003 00:33:56.773058 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c556531c-c24e-48b4-92d7-0eb57a6ff724-builder-dockercfg-bl95s-pull" (OuterVolumeSpecName: "builder-dockercfg-bl95s-pull") pod "c556531c-c24e-48b4-92d7-0eb57a6ff724" (UID: "c556531c-c24e-48b4-92d7-0eb57a6ff724"). InnerVolumeSpecName "builder-dockercfg-bl95s-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 00:33:56 crc kubenswrapper[4832]: I1003 00:33:56.846837 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c556531c-c24e-48b4-92d7-0eb57a6ff724-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "c556531c-c24e-48b4-92d7-0eb57a6ff724" (UID: "c556531c-c24e-48b4-92d7-0eb57a6ff724"). InnerVolumeSpecName "build-blob-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 00:33:56 crc kubenswrapper[4832]: I1003 00:33:56.868828 4832 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c556531c-c24e-48b4-92d7-0eb57a6ff724-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 03 00:33:56 crc kubenswrapper[4832]: I1003 00:33:56.868873 4832 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t6dgn\" (UniqueName: \"kubernetes.io/projected/c556531c-c24e-48b4-92d7-0eb57a6ff724-kube-api-access-t6dgn\") on node \"crc\" DevicePath \"\"" Oct 03 00:33:56 crc kubenswrapper[4832]: I1003 00:33:56.868890 4832 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/c556531c-c24e-48b4-92d7-0eb57a6ff724-build-system-configs\") on node \"crc\" DevicePath \"\"" Oct 03 00:33:56 crc kubenswrapper[4832]: I1003 00:33:56.868901 4832 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-bl95s-push\" (UniqueName: \"kubernetes.io/secret/c556531c-c24e-48b4-92d7-0eb57a6ff724-builder-dockercfg-bl95s-push\") on node \"crc\" DevicePath \"\"" Oct 03 00:33:56 crc kubenswrapper[4832]: I1003 00:33:56.868912 4832 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/c556531c-c24e-48b4-92d7-0eb57a6ff724-build-blob-cache\") on node \"crc\" DevicePath \"\"" Oct 03 00:33:56 crc kubenswrapper[4832]: I1003 00:33:56.868924 4832 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/c556531c-c24e-48b4-92d7-0eb57a6ff724-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Oct 03 00:33:56 crc kubenswrapper[4832]: I1003 00:33:56.868936 4832 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-bl95s-pull\" (UniqueName: \"kubernetes.io/secret/c556531c-c24e-48b4-92d7-0eb57a6ff724-builder-dockercfg-bl95s-pull\") on node \"crc\" DevicePath \"\"" Oct 03 00:33:56 crc kubenswrapper[4832]: I1003 00:33:56.868949 4832 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c556531c-c24e-48b4-92d7-0eb57a6ff724-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 03 00:33:56 crc kubenswrapper[4832]: I1003 00:33:56.868960 4832 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/c556531c-c24e-48b4-92d7-0eb57a6ff724-buildworkdir\") on node \"crc\" DevicePath \"\"" Oct 03 00:33:56 crc kubenswrapper[4832]: I1003 00:33:56.868981 4832 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/c556531c-c24e-48b4-92d7-0eb57a6ff724-container-storage-run\") on node \"crc\" DevicePath \"\"" Oct 03 00:33:56 crc kubenswrapper[4832]: I1003 00:33:56.869016 4832 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/c556531c-c24e-48b4-92d7-0eb57a6ff724-buildcachedir\") on node \"crc\" DevicePath \"\"" Oct 03 00:33:57 crc kubenswrapper[4832]: I1003 00:33:57.351852 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-2-build" event={"ID":"c556531c-c24e-48b4-92d7-0eb57a6ff724","Type":"ContainerDied","Data":"aaa1caf6d1c855b0828bd6e75197bb730a31fbbafe838d771888f394aa8bb5b8"} Oct 03 00:33:57 crc kubenswrapper[4832]: I1003 00:33:57.352183 4832 pod_container_deletor.go:80] "Container not 
found in pod's containers" containerID="aaa1caf6d1c855b0828bd6e75197bb730a31fbbafe838d771888f394aa8bb5b8" Oct 03 00:33:57 crc kubenswrapper[4832]: I1003 00:33:57.351899 4832 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-2-build" Oct 03 00:33:57 crc kubenswrapper[4832]: I1003 00:33:57.513440 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c556531c-c24e-48b4-92d7-0eb57a6ff724-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "c556531c-c24e-48b4-92d7-0eb57a6ff724" (UID: "c556531c-c24e-48b4-92d7-0eb57a6ff724"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 00:33:57 crc kubenswrapper[4832]: I1003 00:33:57.576260 4832 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/c556531c-c24e-48b4-92d7-0eb57a6ff724-container-storage-root\") on node \"crc\" DevicePath \"\"" Oct 03 00:34:04 crc kubenswrapper[4832]: I1003 00:34:04.386297 4832 scope.go:117] "RemoveContainer" containerID="e44b7de9ac0dc4f733d645c5d56e63447d6c890efc8c243af73d5c7937b609ee" Oct 03 00:34:04 crc kubenswrapper[4832]: E1003 00:34:04.386807 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xzpk4_openshift-machine-config-operator(13eeb165-19f7-4e3f-a442-1c9c570d853b)\"" pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" podUID="13eeb165-19f7-4e3f-a442-1c9c570d853b" Oct 03 00:34:06 crc kubenswrapper[4832]: I1003 00:34:06.772886 4832 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/service-telemetry-operator-bundle-1-build"] Oct 03 00:34:06 crc kubenswrapper[4832]: E1003 00:34:06.773131 4832 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c556531c-c24e-48b4-92d7-0eb57a6ff724" containerName="docker-build" Oct 03 00:34:06 crc kubenswrapper[4832]: I1003 00:34:06.773148 4832 state_mem.go:107] "Deleted CPUSet assignment" podUID="c556531c-c24e-48b4-92d7-0eb57a6ff724" containerName="docker-build" Oct 03 00:34:06 crc kubenswrapper[4832]: E1003 00:34:06.773168 4832 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c556531c-c24e-48b4-92d7-0eb57a6ff724" containerName="git-clone" Oct 03 00:34:06 crc kubenswrapper[4832]: I1003 00:34:06.773176 4832 state_mem.go:107] "Deleted CPUSet assignment" podUID="c556531c-c24e-48b4-92d7-0eb57a6ff724" containerName="git-clone" Oct 03 00:34:06 crc kubenswrapper[4832]: E1003 00:34:06.773192 4832 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c556531c-c24e-48b4-92d7-0eb57a6ff724" containerName="manage-dockerfile" Oct 03 00:34:06 crc kubenswrapper[4832]: I1003 00:34:06.773201 4832 state_mem.go:107] "Deleted CPUSet assignment" podUID="c556531c-c24e-48b4-92d7-0eb57a6ff724" containerName="manage-dockerfile" Oct 03 00:34:06 crc kubenswrapper[4832]: I1003 00:34:06.773302 4832 memory_manager.go:354] "RemoveStaleState removing state" podUID="c556531c-c24e-48b4-92d7-0eb57a6ff724" containerName="docker-build" Oct 03 00:34:06 crc kubenswrapper[4832]: I1003 00:34:06.773925 4832 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-bundle-1-build" Oct 03 00:34:06 crc kubenswrapper[4832]: I1003 00:34:06.775851 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-bundle-1-ca" Oct 03 00:34:06 crc kubenswrapper[4832]: I1003 00:34:06.776332 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-bundle-1-sys-config" Oct 03 00:34:06 crc kubenswrapper[4832]: I1003 00:34:06.776495 4832 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-bl95s" Oct 03 00:34:06 crc kubenswrapper[4832]: I1003 00:34:06.777817 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-bundle-1-global-ca" Oct 03 00:34:06 crc kubenswrapper[4832]: I1003 00:34:06.792949 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-bundle-1-build"] Oct 03 00:34:06 crc kubenswrapper[4832]: I1003 00:34:06.886924 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-bl95s-push\" (UniqueName: \"kubernetes.io/secret/85c60d2c-a0f6-4709-ae50-7876a76ef1fd-builder-dockercfg-bl95s-push\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"85c60d2c-a0f6-4709-ae50-7876a76ef1fd\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Oct 03 00:34:06 crc kubenswrapper[4832]: I1003 00:34:06.886977 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/85c60d2c-a0f6-4709-ae50-7876a76ef1fd-build-system-configs\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"85c60d2c-a0f6-4709-ae50-7876a76ef1fd\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Oct 03 00:34:06 crc kubenswrapper[4832]: I1003 00:34:06.887006 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/85c60d2c-a0f6-4709-ae50-7876a76ef1fd-build-ca-bundles\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"85c60d2c-a0f6-4709-ae50-7876a76ef1fd\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Oct 03 00:34:06 crc kubenswrapper[4832]: I1003 00:34:06.887049 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/85c60d2c-a0f6-4709-ae50-7876a76ef1fd-container-storage-root\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"85c60d2c-a0f6-4709-ae50-7876a76ef1fd\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Oct 03 00:34:06 crc kubenswrapper[4832]: I1003 00:34:06.887689 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/85c60d2c-a0f6-4709-ae50-7876a76ef1fd-build-blob-cache\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"85c60d2c-a0f6-4709-ae50-7876a76ef1fd\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Oct 03 00:34:06 crc kubenswrapper[4832]: I1003 00:34:06.887893 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: 
\"kubernetes.io/host-path/85c60d2c-a0f6-4709-ae50-7876a76ef1fd-buildcachedir\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"85c60d2c-a0f6-4709-ae50-7876a76ef1fd\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Oct 03 00:34:06 crc kubenswrapper[4832]: I1003 00:34:06.887953 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/85c60d2c-a0f6-4709-ae50-7876a76ef1fd-container-storage-run\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"85c60d2c-a0f6-4709-ae50-7876a76ef1fd\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Oct 03 00:34:06 crc kubenswrapper[4832]: I1003 00:34:06.888086 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbnks\" (UniqueName: \"kubernetes.io/projected/85c60d2c-a0f6-4709-ae50-7876a76ef1fd-kube-api-access-qbnks\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"85c60d2c-a0f6-4709-ae50-7876a76ef1fd\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Oct 03 00:34:06 crc kubenswrapper[4832]: I1003 00:34:06.888175 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/85c60d2c-a0f6-4709-ae50-7876a76ef1fd-build-proxy-ca-bundles\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"85c60d2c-a0f6-4709-ae50-7876a76ef1fd\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Oct 03 00:34:06 crc kubenswrapper[4832]: I1003 00:34:06.888236 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/85c60d2c-a0f6-4709-ae50-7876a76ef1fd-node-pullsecrets\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"85c60d2c-a0f6-4709-ae50-7876a76ef1fd\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Oct 03 00:34:06 crc kubenswrapper[4832]: I1003 00:34:06.888289 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/85c60d2c-a0f6-4709-ae50-7876a76ef1fd-buildworkdir\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"85c60d2c-a0f6-4709-ae50-7876a76ef1fd\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Oct 03 00:34:06 crc kubenswrapper[4832]: I1003 00:34:06.888346 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-bl95s-pull\" (UniqueName: \"kubernetes.io/secret/85c60d2c-a0f6-4709-ae50-7876a76ef1fd-builder-dockercfg-bl95s-pull\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"85c60d2c-a0f6-4709-ae50-7876a76ef1fd\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Oct 03 00:34:06 crc kubenswrapper[4832]: I1003 00:34:06.990027 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qbnks\" (UniqueName: \"kubernetes.io/projected/85c60d2c-a0f6-4709-ae50-7876a76ef1fd-kube-api-access-qbnks\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"85c60d2c-a0f6-4709-ae50-7876a76ef1fd\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Oct 03 00:34:06 crc kubenswrapper[4832]: I1003 00:34:06.990141 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/85c60d2c-a0f6-4709-ae50-7876a76ef1fd-build-proxy-ca-bundles\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"85c60d2c-a0f6-4709-ae50-7876a76ef1fd\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Oct 03 00:34:06 crc kubenswrapper[4832]: I1003 00:34:06.990168 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/85c60d2c-a0f6-4709-ae50-7876a76ef1fd-node-pullsecrets\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"85c60d2c-a0f6-4709-ae50-7876a76ef1fd\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Oct 03 00:34:06 crc kubenswrapper[4832]: I1003 00:34:06.990188 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/85c60d2c-a0f6-4709-ae50-7876a76ef1fd-buildworkdir\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"85c60d2c-a0f6-4709-ae50-7876a76ef1fd\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Oct 03 00:34:06 crc kubenswrapper[4832]: I1003 00:34:06.990209 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-bl95s-pull\" (UniqueName: \"kubernetes.io/secret/85c60d2c-a0f6-4709-ae50-7876a76ef1fd-builder-dockercfg-bl95s-pull\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"85c60d2c-a0f6-4709-ae50-7876a76ef1fd\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Oct 03 00:34:06 crc kubenswrapper[4832]: I1003 00:34:06.990235 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/85c60d2c-a0f6-4709-ae50-7876a76ef1fd-build-system-configs\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"85c60d2c-a0f6-4709-ae50-7876a76ef1fd\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Oct 03 00:34:06 crc kubenswrapper[4832]: I1003 00:34:06.990256 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/85c60d2c-a0f6-4709-ae50-7876a76ef1fd-build-ca-bundles\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"85c60d2c-a0f6-4709-ae50-7876a76ef1fd\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Oct 03 00:34:06 crc kubenswrapper[4832]: I1003 00:34:06.990278 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-bl95s-push\" (UniqueName: \"kubernetes.io/secret/85c60d2c-a0f6-4709-ae50-7876a76ef1fd-builder-dockercfg-bl95s-push\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"85c60d2c-a0f6-4709-ae50-7876a76ef1fd\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Oct 03 00:34:06 crc kubenswrapper[4832]: I1003 00:34:06.990296 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/85c60d2c-a0f6-4709-ae50-7876a76ef1fd-container-storage-root\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"85c60d2c-a0f6-4709-ae50-7876a76ef1fd\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Oct 03 00:34:06 crc kubenswrapper[4832]: I1003 00:34:06.990324 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/85c60d2c-a0f6-4709-ae50-7876a76ef1fd-build-blob-cache\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"85c60d2c-a0f6-4709-ae50-7876a76ef1fd\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Oct 03 00:34:06 crc kubenswrapper[4832]: I1003 00:34:06.990372 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/85c60d2c-a0f6-4709-ae50-7876a76ef1fd-node-pullsecrets\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"85c60d2c-a0f6-4709-ae50-7876a76ef1fd\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Oct 03 00:34:06 crc kubenswrapper[4832]: I1003 00:34:06.990390 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/85c60d2c-a0f6-4709-ae50-7876a76ef1fd-buildcachedir\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"85c60d2c-a0f6-4709-ae50-7876a76ef1fd\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Oct 03 00:34:06 crc kubenswrapper[4832]: I1003 00:34:06.990426 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/85c60d2c-a0f6-4709-ae50-7876a76ef1fd-buildcachedir\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"85c60d2c-a0f6-4709-ae50-7876a76ef1fd\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Oct 03 00:34:06 crc kubenswrapper[4832]: I1003 00:34:06.991238 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/85c60d2c-a0f6-4709-ae50-7876a76ef1fd-build-proxy-ca-bundles\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"85c60d2c-a0f6-4709-ae50-7876a76ef1fd\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Oct 03 00:34:06 crc kubenswrapper[4832]: I1003 00:34:06.991514 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/85c60d2c-a0f6-4709-ae50-7876a76ef1fd-container-storage-run\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"85c60d2c-a0f6-4709-ae50-7876a76ef1fd\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Oct 03 00:34:06 crc kubenswrapper[4832]: I1003 00:34:06.991557 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/85c60d2c-a0f6-4709-ae50-7876a76ef1fd-build-system-configs\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"85c60d2c-a0f6-4709-ae50-7876a76ef1fd\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Oct 03 00:34:06 crc kubenswrapper[4832]: I1003 00:34:06.991897 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/85c60d2c-a0f6-4709-ae50-7876a76ef1fd-build-blob-cache\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"85c60d2c-a0f6-4709-ae50-7876a76ef1fd\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Oct 03 00:34:06 crc kubenswrapper[4832]: I1003 00:34:06.991923 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/85c60d2c-a0f6-4709-ae50-7876a76ef1fd-container-storage-root\") pod \"service-telemetry-operator-bundle-1-build\" (UID: 
\"85c60d2c-a0f6-4709-ae50-7876a76ef1fd\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Oct 03 00:34:06 crc kubenswrapper[4832]: I1003 00:34:06.992012 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/85c60d2c-a0f6-4709-ae50-7876a76ef1fd-buildworkdir\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"85c60d2c-a0f6-4709-ae50-7876a76ef1fd\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Oct 03 00:34:06 crc kubenswrapper[4832]: I1003 00:34:06.992117 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/85c60d2c-a0f6-4709-ae50-7876a76ef1fd-container-storage-run\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"85c60d2c-a0f6-4709-ae50-7876a76ef1fd\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Oct 03 00:34:06 crc kubenswrapper[4832]: I1003 00:34:06.992196 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/85c60d2c-a0f6-4709-ae50-7876a76ef1fd-build-ca-bundles\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"85c60d2c-a0f6-4709-ae50-7876a76ef1fd\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Oct 03 00:34:07 crc kubenswrapper[4832]: I1003 00:34:07.000454 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-bl95s-push\" (UniqueName: \"kubernetes.io/secret/85c60d2c-a0f6-4709-ae50-7876a76ef1fd-builder-dockercfg-bl95s-push\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"85c60d2c-a0f6-4709-ae50-7876a76ef1fd\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Oct 03 00:34:07 crc kubenswrapper[4832]: I1003 00:34:07.005632 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-bl95s-pull\" (UniqueName: \"kubernetes.io/secret/85c60d2c-a0f6-4709-ae50-7876a76ef1fd-builder-dockercfg-bl95s-pull\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"85c60d2c-a0f6-4709-ae50-7876a76ef1fd\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Oct 03 00:34:07 crc kubenswrapper[4832]: I1003 00:34:07.009217 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qbnks\" (UniqueName: \"kubernetes.io/projected/85c60d2c-a0f6-4709-ae50-7876a76ef1fd-kube-api-access-qbnks\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"85c60d2c-a0f6-4709-ae50-7876a76ef1fd\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Oct 03 00:34:07 crc kubenswrapper[4832]: I1003 00:34:07.095982 4832 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-bundle-1-build" Oct 03 00:34:07 crc kubenswrapper[4832]: I1003 00:34:07.531649 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-bundle-1-build"] Oct 03 00:34:08 crc kubenswrapper[4832]: I1003 00:34:08.416185 4832 generic.go:334] "Generic (PLEG): container finished" podID="85c60d2c-a0f6-4709-ae50-7876a76ef1fd" containerID="d6d2ee1959c061b6ba3e2568f6befcba100a2a2bc342949d6491887aeed58d36" exitCode=0 Oct 03 00:34:08 crc kubenswrapper[4832]: I1003 00:34:08.416263 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-1-build" event={"ID":"85c60d2c-a0f6-4709-ae50-7876a76ef1fd","Type":"ContainerDied","Data":"d6d2ee1959c061b6ba3e2568f6befcba100a2a2bc342949d6491887aeed58d36"} Oct 03 00:34:08 crc kubenswrapper[4832]: I1003 00:34:08.416435 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-1-build" event={"ID":"85c60d2c-a0f6-4709-ae50-7876a76ef1fd","Type":"ContainerStarted","Data":"23a1ad81d33dc503caf6b593cb58bf444b5e6bb98b0fafa9f377745d4d7987d8"} Oct 03 00:34:09 crc kubenswrapper[4832]: I1003 00:34:09.424679 4832 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-bundle-1-build_85c60d2c-a0f6-4709-ae50-7876a76ef1fd/docker-build/0.log" Oct 03 00:34:09 crc kubenswrapper[4832]: I1003 00:34:09.426350 4832 generic.go:334] "Generic (PLEG): container finished" podID="85c60d2c-a0f6-4709-ae50-7876a76ef1fd" containerID="1d81fc5a6cca6d635409fe31566be937cfca1450fde32e3e86de414cdf57eee8" exitCode=1 Oct 03 00:34:09 crc kubenswrapper[4832]: I1003 00:34:09.426410 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-1-build" event={"ID":"85c60d2c-a0f6-4709-ae50-7876a76ef1fd","Type":"ContainerDied","Data":"1d81fc5a6cca6d635409fe31566be937cfca1450fde32e3e86de414cdf57eee8"} Oct 03 00:34:10 crc kubenswrapper[4832]: I1003 00:34:10.672074 4832 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-bundle-1-build_85c60d2c-a0f6-4709-ae50-7876a76ef1fd/docker-build/0.log" Oct 03 00:34:10 crc kubenswrapper[4832]: I1003 00:34:10.673162 4832 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-bundle-1-build" Oct 03 00:34:10 crc kubenswrapper[4832]: I1003 00:34:10.844160 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/85c60d2c-a0f6-4709-ae50-7876a76ef1fd-container-storage-run\") pod \"85c60d2c-a0f6-4709-ae50-7876a76ef1fd\" (UID: \"85c60d2c-a0f6-4709-ae50-7876a76ef1fd\") " Oct 03 00:34:10 crc kubenswrapper[4832]: I1003 00:34:10.844248 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-bl95s-pull\" (UniqueName: \"kubernetes.io/secret/85c60d2c-a0f6-4709-ae50-7876a76ef1fd-builder-dockercfg-bl95s-pull\") pod \"85c60d2c-a0f6-4709-ae50-7876a76ef1fd\" (UID: \"85c60d2c-a0f6-4709-ae50-7876a76ef1fd\") " Oct 03 00:34:10 crc kubenswrapper[4832]: I1003 00:34:10.844315 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/85c60d2c-a0f6-4709-ae50-7876a76ef1fd-container-storage-root\") pod \"85c60d2c-a0f6-4709-ae50-7876a76ef1fd\" (UID: \"85c60d2c-a0f6-4709-ae50-7876a76ef1fd\") " Oct 03 00:34:10 crc kubenswrapper[4832]: I1003 00:34:10.844382 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/85c60d2c-a0f6-4709-ae50-7876a76ef1fd-buildcachedir\") pod \"85c60d2c-a0f6-4709-ae50-7876a76ef1fd\" (UID: \"85c60d2c-a0f6-4709-ae50-7876a76ef1fd\") " Oct 03 00:34:10 crc kubenswrapper[4832]: I1003 00:34:10.844451 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-bl95s-push\" (UniqueName: \"kubernetes.io/secret/85c60d2c-a0f6-4709-ae50-7876a76ef1fd-builder-dockercfg-bl95s-push\") pod \"85c60d2c-a0f6-4709-ae50-7876a76ef1fd\" (UID: \"85c60d2c-a0f6-4709-ae50-7876a76ef1fd\") " Oct 03 00:34:10 crc kubenswrapper[4832]: I1003 00:34:10.844490 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/85c60d2c-a0f6-4709-ae50-7876a76ef1fd-build-system-configs\") pod \"85c60d2c-a0f6-4709-ae50-7876a76ef1fd\" (UID: \"85c60d2c-a0f6-4709-ae50-7876a76ef1fd\") " Oct 03 00:34:10 crc kubenswrapper[4832]: I1003 00:34:10.844521 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/85c60d2c-a0f6-4709-ae50-7876a76ef1fd-build-blob-cache\") pod \"85c60d2c-a0f6-4709-ae50-7876a76ef1fd\" (UID: \"85c60d2c-a0f6-4709-ae50-7876a76ef1fd\") " Oct 03 00:34:10 crc kubenswrapper[4832]: I1003 00:34:10.844558 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/85c60d2c-a0f6-4709-ae50-7876a76ef1fd-build-proxy-ca-bundles\") pod \"85c60d2c-a0f6-4709-ae50-7876a76ef1fd\" (UID: \"85c60d2c-a0f6-4709-ae50-7876a76ef1fd\") " Oct 03 00:34:10 crc kubenswrapper[4832]: I1003 00:34:10.844606 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/85c60d2c-a0f6-4709-ae50-7876a76ef1fd-build-ca-bundles\") pod \"85c60d2c-a0f6-4709-ae50-7876a76ef1fd\" (UID: \"85c60d2c-a0f6-4709-ae50-7876a76ef1fd\") " Oct 03 00:34:10 crc kubenswrapper[4832]: I1003 00:34:10.844636 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" 
(UniqueName: \"kubernetes.io/empty-dir/85c60d2c-a0f6-4709-ae50-7876a76ef1fd-buildworkdir\") pod \"85c60d2c-a0f6-4709-ae50-7876a76ef1fd\" (UID: \"85c60d2c-a0f6-4709-ae50-7876a76ef1fd\") " Oct 03 00:34:10 crc kubenswrapper[4832]: I1003 00:34:10.844689 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/85c60d2c-a0f6-4709-ae50-7876a76ef1fd-node-pullsecrets\") pod \"85c60d2c-a0f6-4709-ae50-7876a76ef1fd\" (UID: \"85c60d2c-a0f6-4709-ae50-7876a76ef1fd\") " Oct 03 00:34:10 crc kubenswrapper[4832]: I1003 00:34:10.844728 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qbnks\" (UniqueName: \"kubernetes.io/projected/85c60d2c-a0f6-4709-ae50-7876a76ef1fd-kube-api-access-qbnks\") pod \"85c60d2c-a0f6-4709-ae50-7876a76ef1fd\" (UID: \"85c60d2c-a0f6-4709-ae50-7876a76ef1fd\") " Oct 03 00:34:10 crc kubenswrapper[4832]: I1003 00:34:10.845185 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/85c60d2c-a0f6-4709-ae50-7876a76ef1fd-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "85c60d2c-a0f6-4709-ae50-7876a76ef1fd" (UID: "85c60d2c-a0f6-4709-ae50-7876a76ef1fd"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 00:34:10 crc kubenswrapper[4832]: I1003 00:34:10.845229 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/85c60d2c-a0f6-4709-ae50-7876a76ef1fd-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "85c60d2c-a0f6-4709-ae50-7876a76ef1fd" (UID: "85c60d2c-a0f6-4709-ae50-7876a76ef1fd"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 00:34:10 crc kubenswrapper[4832]: I1003 00:34:10.845382 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/85c60d2c-a0f6-4709-ae50-7876a76ef1fd-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "85c60d2c-a0f6-4709-ae50-7876a76ef1fd" (UID: "85c60d2c-a0f6-4709-ae50-7876a76ef1fd"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 00:34:10 crc kubenswrapper[4832]: I1003 00:34:10.845795 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/85c60d2c-a0f6-4709-ae50-7876a76ef1fd-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "85c60d2c-a0f6-4709-ae50-7876a76ef1fd" (UID: "85c60d2c-a0f6-4709-ae50-7876a76ef1fd"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 00:34:10 crc kubenswrapper[4832]: I1003 00:34:10.846001 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/85c60d2c-a0f6-4709-ae50-7876a76ef1fd-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "85c60d2c-a0f6-4709-ae50-7876a76ef1fd" (UID: "85c60d2c-a0f6-4709-ae50-7876a76ef1fd"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 00:34:10 crc kubenswrapper[4832]: I1003 00:34:10.846032 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/85c60d2c-a0f6-4709-ae50-7876a76ef1fd-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "85c60d2c-a0f6-4709-ae50-7876a76ef1fd" (UID: "85c60d2c-a0f6-4709-ae50-7876a76ef1fd"). InnerVolumeSpecName "container-storage-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 00:34:10 crc kubenswrapper[4832]: I1003 00:34:10.846109 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/85c60d2c-a0f6-4709-ae50-7876a76ef1fd-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "85c60d2c-a0f6-4709-ae50-7876a76ef1fd" (UID: "85c60d2c-a0f6-4709-ae50-7876a76ef1fd"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 00:34:10 crc kubenswrapper[4832]: I1003 00:34:10.846791 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/85c60d2c-a0f6-4709-ae50-7876a76ef1fd-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "85c60d2c-a0f6-4709-ae50-7876a76ef1fd" (UID: "85c60d2c-a0f6-4709-ae50-7876a76ef1fd"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 00:34:10 crc kubenswrapper[4832]: I1003 00:34:10.847427 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/85c60d2c-a0f6-4709-ae50-7876a76ef1fd-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "85c60d2c-a0f6-4709-ae50-7876a76ef1fd" (UID: "85c60d2c-a0f6-4709-ae50-7876a76ef1fd"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 00:34:10 crc kubenswrapper[4832]: I1003 00:34:10.850764 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85c60d2c-a0f6-4709-ae50-7876a76ef1fd-builder-dockercfg-bl95s-pull" (OuterVolumeSpecName: "builder-dockercfg-bl95s-pull") pod "85c60d2c-a0f6-4709-ae50-7876a76ef1fd" (UID: "85c60d2c-a0f6-4709-ae50-7876a76ef1fd"). InnerVolumeSpecName "builder-dockercfg-bl95s-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 00:34:10 crc kubenswrapper[4832]: I1003 00:34:10.851300 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85c60d2c-a0f6-4709-ae50-7876a76ef1fd-builder-dockercfg-bl95s-push" (OuterVolumeSpecName: "builder-dockercfg-bl95s-push") pod "85c60d2c-a0f6-4709-ae50-7876a76ef1fd" (UID: "85c60d2c-a0f6-4709-ae50-7876a76ef1fd"). InnerVolumeSpecName "builder-dockercfg-bl95s-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 00:34:10 crc kubenswrapper[4832]: I1003 00:34:10.851736 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85c60d2c-a0f6-4709-ae50-7876a76ef1fd-kube-api-access-qbnks" (OuterVolumeSpecName: "kube-api-access-qbnks") pod "85c60d2c-a0f6-4709-ae50-7876a76ef1fd" (UID: "85c60d2c-a0f6-4709-ae50-7876a76ef1fd"). InnerVolumeSpecName "kube-api-access-qbnks". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 00:34:10 crc kubenswrapper[4832]: I1003 00:34:10.945949 4832 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-bl95s-push\" (UniqueName: \"kubernetes.io/secret/85c60d2c-a0f6-4709-ae50-7876a76ef1fd-builder-dockercfg-bl95s-push\") on node \"crc\" DevicePath \"\"" Oct 03 00:34:10 crc kubenswrapper[4832]: I1003 00:34:10.945988 4832 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/85c60d2c-a0f6-4709-ae50-7876a76ef1fd-build-blob-cache\") on node \"crc\" DevicePath \"\"" Oct 03 00:34:10 crc kubenswrapper[4832]: I1003 00:34:10.946003 4832 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/85c60d2c-a0f6-4709-ae50-7876a76ef1fd-build-system-configs\") on node \"crc\" DevicePath \"\"" Oct 03 00:34:10 crc kubenswrapper[4832]: I1003 00:34:10.946017 4832 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/85c60d2c-a0f6-4709-ae50-7876a76ef1fd-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 03 00:34:10 crc kubenswrapper[4832]: I1003 00:34:10.946028 4832 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/85c60d2c-a0f6-4709-ae50-7876a76ef1fd-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 03 00:34:10 crc kubenswrapper[4832]: I1003 00:34:10.946040 4832 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/85c60d2c-a0f6-4709-ae50-7876a76ef1fd-buildworkdir\") on node \"crc\" DevicePath \"\"" Oct 03 00:34:10 crc kubenswrapper[4832]: I1003 00:34:10.946051 4832 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/85c60d2c-a0f6-4709-ae50-7876a76ef1fd-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Oct 03 00:34:10 crc kubenswrapper[4832]: I1003 00:34:10.946063 4832 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qbnks\" (UniqueName: \"kubernetes.io/projected/85c60d2c-a0f6-4709-ae50-7876a76ef1fd-kube-api-access-qbnks\") on node \"crc\" DevicePath \"\"" Oct 03 00:34:10 crc kubenswrapper[4832]: I1003 00:34:10.946084 4832 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/85c60d2c-a0f6-4709-ae50-7876a76ef1fd-container-storage-run\") on node \"crc\" DevicePath \"\"" Oct 03 00:34:10 crc kubenswrapper[4832]: I1003 00:34:10.946095 4832 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-bl95s-pull\" (UniqueName: \"kubernetes.io/secret/85c60d2c-a0f6-4709-ae50-7876a76ef1fd-builder-dockercfg-bl95s-pull\") on node \"crc\" DevicePath \"\"" Oct 03 00:34:10 crc kubenswrapper[4832]: I1003 00:34:10.946108 4832 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/85c60d2c-a0f6-4709-ae50-7876a76ef1fd-container-storage-root\") on node \"crc\" DevicePath \"\"" Oct 03 00:34:10 crc kubenswrapper[4832]: I1003 00:34:10.946120 4832 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/85c60d2c-a0f6-4709-ae50-7876a76ef1fd-buildcachedir\") on node \"crc\" DevicePath \"\"" Oct 03 00:34:11 crc kubenswrapper[4832]: I1003 00:34:11.443087 4832 log.go:25] "Finished parsing log file" 
path="/var/log/pods/service-telemetry_service-telemetry-operator-bundle-1-build_85c60d2c-a0f6-4709-ae50-7876a76ef1fd/docker-build/0.log" Oct 03 00:34:11 crc kubenswrapper[4832]: I1003 00:34:11.443735 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-1-build" event={"ID":"85c60d2c-a0f6-4709-ae50-7876a76ef1fd","Type":"ContainerDied","Data":"23a1ad81d33dc503caf6b593cb58bf444b5e6bb98b0fafa9f377745d4d7987d8"} Oct 03 00:34:11 crc kubenswrapper[4832]: I1003 00:34:11.443813 4832 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="23a1ad81d33dc503caf6b593cb58bf444b5e6bb98b0fafa9f377745d4d7987d8" Oct 03 00:34:11 crc kubenswrapper[4832]: I1003 00:34:11.443781 4832 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-bundle-1-build" Oct 03 00:34:17 crc kubenswrapper[4832]: I1003 00:34:17.275773 4832 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/service-telemetry-operator-bundle-1-build"] Oct 03 00:34:17 crc kubenswrapper[4832]: I1003 00:34:17.288707 4832 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/service-telemetry-operator-bundle-1-build"] Oct 03 00:34:17 crc kubenswrapper[4832]: I1003 00:34:17.392398 4832 scope.go:117] "RemoveContainer" containerID="e44b7de9ac0dc4f733d645c5d56e63447d6c890efc8c243af73d5c7937b609ee" Oct 03 00:34:17 crc kubenswrapper[4832]: E1003 00:34:17.392813 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xzpk4_openshift-machine-config-operator(13eeb165-19f7-4e3f-a442-1c9c570d853b)\"" pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" podUID="13eeb165-19f7-4e3f-a442-1c9c570d853b" Oct 03 00:34:17 crc kubenswrapper[4832]: I1003 00:34:17.398887 4832 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85c60d2c-a0f6-4709-ae50-7876a76ef1fd" path="/var/lib/kubelet/pods/85c60d2c-a0f6-4709-ae50-7876a76ef1fd/volumes" Oct 03 00:34:18 crc kubenswrapper[4832]: I1003 00:34:18.879007 4832 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/service-telemetry-operator-bundle-2-build"] Oct 03 00:34:18 crc kubenswrapper[4832]: E1003 00:34:18.879575 4832 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85c60d2c-a0f6-4709-ae50-7876a76ef1fd" containerName="docker-build" Oct 03 00:34:18 crc kubenswrapper[4832]: I1003 00:34:18.879591 4832 state_mem.go:107] "Deleted CPUSet assignment" podUID="85c60d2c-a0f6-4709-ae50-7876a76ef1fd" containerName="docker-build" Oct 03 00:34:18 crc kubenswrapper[4832]: E1003 00:34:18.879627 4832 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85c60d2c-a0f6-4709-ae50-7876a76ef1fd" containerName="manage-dockerfile" Oct 03 00:34:18 crc kubenswrapper[4832]: I1003 00:34:18.879636 4832 state_mem.go:107] "Deleted CPUSet assignment" podUID="85c60d2c-a0f6-4709-ae50-7876a76ef1fd" containerName="manage-dockerfile" Oct 03 00:34:18 crc kubenswrapper[4832]: I1003 00:34:18.879776 4832 memory_manager.go:354] "RemoveStaleState removing state" podUID="85c60d2c-a0f6-4709-ae50-7876a76ef1fd" containerName="docker-build" Oct 03 00:34:18 crc kubenswrapper[4832]: I1003 00:34:18.880579 4832 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-bundle-2-build" Oct 03 00:34:18 crc kubenswrapper[4832]: I1003 00:34:18.882991 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-bundle-2-ca" Oct 03 00:34:18 crc kubenswrapper[4832]: I1003 00:34:18.883098 4832 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-bl95s" Oct 03 00:34:18 crc kubenswrapper[4832]: I1003 00:34:18.883218 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-bundle-2-sys-config" Oct 03 00:34:18 crc kubenswrapper[4832]: I1003 00:34:18.883838 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-bundle-2-global-ca" Oct 03 00:34:18 crc kubenswrapper[4832]: I1003 00:34:18.901579 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-bundle-2-build"] Oct 03 00:34:19 crc kubenswrapper[4832]: I1003 00:34:19.053621 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/393d549e-beb1-4825-b70f-22982da03efe-buildworkdir\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"393d549e-beb1-4825-b70f-22982da03efe\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Oct 03 00:34:19 crc kubenswrapper[4832]: I1003 00:34:19.053667 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/393d549e-beb1-4825-b70f-22982da03efe-build-proxy-ca-bundles\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"393d549e-beb1-4825-b70f-22982da03efe\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Oct 03 00:34:19 crc kubenswrapper[4832]: I1003 00:34:19.053719 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/393d549e-beb1-4825-b70f-22982da03efe-buildcachedir\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"393d549e-beb1-4825-b70f-22982da03efe\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Oct 03 00:34:19 crc kubenswrapper[4832]: I1003 00:34:19.053738 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/393d549e-beb1-4825-b70f-22982da03efe-build-ca-bundles\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"393d549e-beb1-4825-b70f-22982da03efe\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Oct 03 00:34:19 crc kubenswrapper[4832]: I1003 00:34:19.053782 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/393d549e-beb1-4825-b70f-22982da03efe-build-system-configs\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"393d549e-beb1-4825-b70f-22982da03efe\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Oct 03 00:34:19 crc kubenswrapper[4832]: I1003 00:34:19.053804 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/393d549e-beb1-4825-b70f-22982da03efe-build-blob-cache\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"393d549e-beb1-4825-b70f-22982da03efe\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Oct 03 00:34:19 crc kubenswrapper[4832]: I1003 00:34:19.053826 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9ww5z\" (UniqueName: \"kubernetes.io/projected/393d549e-beb1-4825-b70f-22982da03efe-kube-api-access-9ww5z\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"393d549e-beb1-4825-b70f-22982da03efe\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Oct 03 00:34:19 crc kubenswrapper[4832]: I1003 00:34:19.053884 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/393d549e-beb1-4825-b70f-22982da03efe-node-pullsecrets\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"393d549e-beb1-4825-b70f-22982da03efe\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Oct 03 00:34:19 crc kubenswrapper[4832]: I1003 00:34:19.053909 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/393d549e-beb1-4825-b70f-22982da03efe-container-storage-root\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"393d549e-beb1-4825-b70f-22982da03efe\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Oct 03 00:34:19 crc kubenswrapper[4832]: I1003 00:34:19.053941 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-bl95s-push\" (UniqueName: \"kubernetes.io/secret/393d549e-beb1-4825-b70f-22982da03efe-builder-dockercfg-bl95s-push\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"393d549e-beb1-4825-b70f-22982da03efe\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Oct 03 00:34:19 crc kubenswrapper[4832]: I1003 00:34:19.054051 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-bl95s-pull\" (UniqueName: \"kubernetes.io/secret/393d549e-beb1-4825-b70f-22982da03efe-builder-dockercfg-bl95s-pull\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"393d549e-beb1-4825-b70f-22982da03efe\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Oct 03 00:34:19 crc kubenswrapper[4832]: I1003 00:34:19.054079 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/393d549e-beb1-4825-b70f-22982da03efe-container-storage-run\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"393d549e-beb1-4825-b70f-22982da03efe\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Oct 03 00:34:19 crc kubenswrapper[4832]: I1003 00:34:19.154946 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9ww5z\" (UniqueName: \"kubernetes.io/projected/393d549e-beb1-4825-b70f-22982da03efe-kube-api-access-9ww5z\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"393d549e-beb1-4825-b70f-22982da03efe\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Oct 03 00:34:19 crc kubenswrapper[4832]: I1003 00:34:19.154996 4832 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/393d549e-beb1-4825-b70f-22982da03efe-container-storage-root\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"393d549e-beb1-4825-b70f-22982da03efe\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Oct 03 00:34:19 crc kubenswrapper[4832]: I1003 00:34:19.155015 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/393d549e-beb1-4825-b70f-22982da03efe-node-pullsecrets\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"393d549e-beb1-4825-b70f-22982da03efe\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Oct 03 00:34:19 crc kubenswrapper[4832]: I1003 00:34:19.155048 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-bl95s-push\" (UniqueName: \"kubernetes.io/secret/393d549e-beb1-4825-b70f-22982da03efe-builder-dockercfg-bl95s-push\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"393d549e-beb1-4825-b70f-22982da03efe\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Oct 03 00:34:19 crc kubenswrapper[4832]: I1003 00:34:19.155079 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-bl95s-pull\" (UniqueName: \"kubernetes.io/secret/393d549e-beb1-4825-b70f-22982da03efe-builder-dockercfg-bl95s-pull\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"393d549e-beb1-4825-b70f-22982da03efe\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Oct 03 00:34:19 crc kubenswrapper[4832]: I1003 00:34:19.155108 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/393d549e-beb1-4825-b70f-22982da03efe-container-storage-run\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"393d549e-beb1-4825-b70f-22982da03efe\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Oct 03 00:34:19 crc kubenswrapper[4832]: I1003 00:34:19.155137 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/393d549e-beb1-4825-b70f-22982da03efe-buildworkdir\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"393d549e-beb1-4825-b70f-22982da03efe\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Oct 03 00:34:19 crc kubenswrapper[4832]: I1003 00:34:19.155163 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/393d549e-beb1-4825-b70f-22982da03efe-build-proxy-ca-bundles\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"393d549e-beb1-4825-b70f-22982da03efe\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Oct 03 00:34:19 crc kubenswrapper[4832]: I1003 00:34:19.155182 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/393d549e-beb1-4825-b70f-22982da03efe-node-pullsecrets\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"393d549e-beb1-4825-b70f-22982da03efe\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Oct 03 00:34:19 crc kubenswrapper[4832]: I1003 00:34:19.155192 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" 
(UniqueName: \"kubernetes.io/host-path/393d549e-beb1-4825-b70f-22982da03efe-buildcachedir\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"393d549e-beb1-4825-b70f-22982da03efe\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Oct 03 00:34:19 crc kubenswrapper[4832]: I1003 00:34:19.155230 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/393d549e-beb1-4825-b70f-22982da03efe-buildcachedir\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"393d549e-beb1-4825-b70f-22982da03efe\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Oct 03 00:34:19 crc kubenswrapper[4832]: I1003 00:34:19.155251 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/393d549e-beb1-4825-b70f-22982da03efe-build-ca-bundles\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"393d549e-beb1-4825-b70f-22982da03efe\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Oct 03 00:34:19 crc kubenswrapper[4832]: I1003 00:34:19.155277 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/393d549e-beb1-4825-b70f-22982da03efe-build-system-configs\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"393d549e-beb1-4825-b70f-22982da03efe\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Oct 03 00:34:19 crc kubenswrapper[4832]: I1003 00:34:19.155302 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/393d549e-beb1-4825-b70f-22982da03efe-build-blob-cache\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"393d549e-beb1-4825-b70f-22982da03efe\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Oct 03 00:34:19 crc kubenswrapper[4832]: I1003 00:34:19.155747 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/393d549e-beb1-4825-b70f-22982da03efe-buildworkdir\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"393d549e-beb1-4825-b70f-22982da03efe\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Oct 03 00:34:19 crc kubenswrapper[4832]: I1003 00:34:19.155959 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/393d549e-beb1-4825-b70f-22982da03efe-build-blob-cache\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"393d549e-beb1-4825-b70f-22982da03efe\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Oct 03 00:34:19 crc kubenswrapper[4832]: I1003 00:34:19.156196 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/393d549e-beb1-4825-b70f-22982da03efe-build-proxy-ca-bundles\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"393d549e-beb1-4825-b70f-22982da03efe\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Oct 03 00:34:19 crc kubenswrapper[4832]: I1003 00:34:19.156274 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/393d549e-beb1-4825-b70f-22982da03efe-container-storage-root\") pod \"service-telemetry-operator-bundle-2-build\" (UID: 
\"393d549e-beb1-4825-b70f-22982da03efe\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Oct 03 00:34:19 crc kubenswrapper[4832]: I1003 00:34:19.156346 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/393d549e-beb1-4825-b70f-22982da03efe-container-storage-run\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"393d549e-beb1-4825-b70f-22982da03efe\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Oct 03 00:34:19 crc kubenswrapper[4832]: I1003 00:34:19.156435 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/393d549e-beb1-4825-b70f-22982da03efe-build-system-configs\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"393d549e-beb1-4825-b70f-22982da03efe\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Oct 03 00:34:19 crc kubenswrapper[4832]: I1003 00:34:19.156462 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/393d549e-beb1-4825-b70f-22982da03efe-build-ca-bundles\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"393d549e-beb1-4825-b70f-22982da03efe\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Oct 03 00:34:19 crc kubenswrapper[4832]: I1003 00:34:19.164262 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-bl95s-pull\" (UniqueName: \"kubernetes.io/secret/393d549e-beb1-4825-b70f-22982da03efe-builder-dockercfg-bl95s-pull\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"393d549e-beb1-4825-b70f-22982da03efe\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Oct 03 00:34:19 crc kubenswrapper[4832]: I1003 00:34:19.164286 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-bl95s-push\" (UniqueName: \"kubernetes.io/secret/393d549e-beb1-4825-b70f-22982da03efe-builder-dockercfg-bl95s-push\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"393d549e-beb1-4825-b70f-22982da03efe\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Oct 03 00:34:19 crc kubenswrapper[4832]: I1003 00:34:19.180922 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9ww5z\" (UniqueName: \"kubernetes.io/projected/393d549e-beb1-4825-b70f-22982da03efe-kube-api-access-9ww5z\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"393d549e-beb1-4825-b70f-22982da03efe\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Oct 03 00:34:19 crc kubenswrapper[4832]: I1003 00:34:19.212498 4832 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-bundle-2-build" Oct 03 00:34:19 crc kubenswrapper[4832]: I1003 00:34:19.436511 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-bundle-2-build"] Oct 03 00:34:19 crc kubenswrapper[4832]: I1003 00:34:19.501800 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-2-build" event={"ID":"393d549e-beb1-4825-b70f-22982da03efe","Type":"ContainerStarted","Data":"c1f80c4bf0abce04b00c5c286fc25296ba31641b4f67d9f117128d692ff0442b"} Oct 03 00:34:20 crc kubenswrapper[4832]: I1003 00:34:20.513139 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-2-build" event={"ID":"393d549e-beb1-4825-b70f-22982da03efe","Type":"ContainerStarted","Data":"42e6be3d407866bf6e7f4bd938b3d45cd6f892c430f40ae1b1901fa3682fca4e"} Oct 03 00:34:21 crc kubenswrapper[4832]: I1003 00:34:21.523695 4832 generic.go:334] "Generic (PLEG): container finished" podID="393d549e-beb1-4825-b70f-22982da03efe" containerID="42e6be3d407866bf6e7f4bd938b3d45cd6f892c430f40ae1b1901fa3682fca4e" exitCode=0 Oct 03 00:34:21 crc kubenswrapper[4832]: I1003 00:34:21.523800 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-2-build" event={"ID":"393d549e-beb1-4825-b70f-22982da03efe","Type":"ContainerDied","Data":"42e6be3d407866bf6e7f4bd938b3d45cd6f892c430f40ae1b1901fa3682fca4e"} Oct 03 00:34:22 crc kubenswrapper[4832]: I1003 00:34:22.538949 4832 generic.go:334] "Generic (PLEG): container finished" podID="393d549e-beb1-4825-b70f-22982da03efe" containerID="3013a288ff273ad5eb614e30719426b26be56a7725dadf556ed58381b30c11a9" exitCode=0 Oct 03 00:34:22 crc kubenswrapper[4832]: I1003 00:34:22.539505 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-2-build" event={"ID":"393d549e-beb1-4825-b70f-22982da03efe","Type":"ContainerDied","Data":"3013a288ff273ad5eb614e30719426b26be56a7725dadf556ed58381b30c11a9"} Oct 03 00:34:22 crc kubenswrapper[4832]: I1003 00:34:22.584302 4832 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-bundle-2-build_393d549e-beb1-4825-b70f-22982da03efe/manage-dockerfile/0.log" Oct 03 00:34:23 crc kubenswrapper[4832]: I1003 00:34:23.549874 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-2-build" event={"ID":"393d549e-beb1-4825-b70f-22982da03efe","Type":"ContainerStarted","Data":"3513157509b6c5791eda03f6a8f36fd871261c3b14899e4c0f2820772c8f770a"} Oct 03 00:34:23 crc kubenswrapper[4832]: I1003 00:34:23.590458 4832 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/service-telemetry-operator-bundle-2-build" podStartSLOduration=5.590416547 podStartE2EDuration="5.590416547s" podCreationTimestamp="2025-10-03 00:34:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 00:34:23.582196336 +0000 UTC m=+1656.844351432" watchObservedRunningTime="2025-10-03 00:34:23.590416547 +0000 UTC m=+1656.852571693" Oct 03 00:34:25 crc kubenswrapper[4832]: I1003 00:34:25.563280 4832 generic.go:334] "Generic (PLEG): container finished" podID="393d549e-beb1-4825-b70f-22982da03efe" containerID="3513157509b6c5791eda03f6a8f36fd871261c3b14899e4c0f2820772c8f770a" 
exitCode=0 Oct 03 00:34:25 crc kubenswrapper[4832]: I1003 00:34:25.563352 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-2-build" event={"ID":"393d549e-beb1-4825-b70f-22982da03efe","Type":"ContainerDied","Data":"3513157509b6c5791eda03f6a8f36fd871261c3b14899e4c0f2820772c8f770a"} Oct 03 00:34:26 crc kubenswrapper[4832]: I1003 00:34:26.825050 4832 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-bundle-2-build" Oct 03 00:34:26 crc kubenswrapper[4832]: I1003 00:34:26.989864 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9ww5z\" (UniqueName: \"kubernetes.io/projected/393d549e-beb1-4825-b70f-22982da03efe-kube-api-access-9ww5z\") pod \"393d549e-beb1-4825-b70f-22982da03efe\" (UID: \"393d549e-beb1-4825-b70f-22982da03efe\") " Oct 03 00:34:26 crc kubenswrapper[4832]: I1003 00:34:26.989917 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/393d549e-beb1-4825-b70f-22982da03efe-container-storage-run\") pod \"393d549e-beb1-4825-b70f-22982da03efe\" (UID: \"393d549e-beb1-4825-b70f-22982da03efe\") " Oct 03 00:34:26 crc kubenswrapper[4832]: I1003 00:34:26.989952 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-bl95s-pull\" (UniqueName: \"kubernetes.io/secret/393d549e-beb1-4825-b70f-22982da03efe-builder-dockercfg-bl95s-pull\") pod \"393d549e-beb1-4825-b70f-22982da03efe\" (UID: \"393d549e-beb1-4825-b70f-22982da03efe\") " Oct 03 00:34:26 crc kubenswrapper[4832]: I1003 00:34:26.989977 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/393d549e-beb1-4825-b70f-22982da03efe-buildcachedir\") pod \"393d549e-beb1-4825-b70f-22982da03efe\" (UID: \"393d549e-beb1-4825-b70f-22982da03efe\") " Oct 03 00:34:26 crc kubenswrapper[4832]: I1003 00:34:26.990015 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/393d549e-beb1-4825-b70f-22982da03efe-build-system-configs\") pod \"393d549e-beb1-4825-b70f-22982da03efe\" (UID: \"393d549e-beb1-4825-b70f-22982da03efe\") " Oct 03 00:34:26 crc kubenswrapper[4832]: I1003 00:34:26.990045 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/393d549e-beb1-4825-b70f-22982da03efe-build-proxy-ca-bundles\") pod \"393d549e-beb1-4825-b70f-22982da03efe\" (UID: \"393d549e-beb1-4825-b70f-22982da03efe\") " Oct 03 00:34:26 crc kubenswrapper[4832]: I1003 00:34:26.990096 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/393d549e-beb1-4825-b70f-22982da03efe-build-blob-cache\") pod \"393d549e-beb1-4825-b70f-22982da03efe\" (UID: \"393d549e-beb1-4825-b70f-22982da03efe\") " Oct 03 00:34:26 crc kubenswrapper[4832]: I1003 00:34:26.990119 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/393d549e-beb1-4825-b70f-22982da03efe-buildworkdir\") pod \"393d549e-beb1-4825-b70f-22982da03efe\" (UID: \"393d549e-beb1-4825-b70f-22982da03efe\") " Oct 03 00:34:26 crc kubenswrapper[4832]: I1003 00:34:26.990161 4832 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/393d549e-beb1-4825-b70f-22982da03efe-container-storage-root\") pod \"393d549e-beb1-4825-b70f-22982da03efe\" (UID: \"393d549e-beb1-4825-b70f-22982da03efe\") " Oct 03 00:34:26 crc kubenswrapper[4832]: I1003 00:34:26.990189 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-bl95s-push\" (UniqueName: \"kubernetes.io/secret/393d549e-beb1-4825-b70f-22982da03efe-builder-dockercfg-bl95s-push\") pod \"393d549e-beb1-4825-b70f-22982da03efe\" (UID: \"393d549e-beb1-4825-b70f-22982da03efe\") " Oct 03 00:34:26 crc kubenswrapper[4832]: I1003 00:34:26.990212 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/393d549e-beb1-4825-b70f-22982da03efe-build-ca-bundles\") pod \"393d549e-beb1-4825-b70f-22982da03efe\" (UID: \"393d549e-beb1-4825-b70f-22982da03efe\") " Oct 03 00:34:26 crc kubenswrapper[4832]: I1003 00:34:26.990246 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/393d549e-beb1-4825-b70f-22982da03efe-node-pullsecrets\") pod \"393d549e-beb1-4825-b70f-22982da03efe\" (UID: \"393d549e-beb1-4825-b70f-22982da03efe\") " Oct 03 00:34:26 crc kubenswrapper[4832]: I1003 00:34:26.990564 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/393d549e-beb1-4825-b70f-22982da03efe-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "393d549e-beb1-4825-b70f-22982da03efe" (UID: "393d549e-beb1-4825-b70f-22982da03efe"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 00:34:26 crc kubenswrapper[4832]: I1003 00:34:26.990860 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/393d549e-beb1-4825-b70f-22982da03efe-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "393d549e-beb1-4825-b70f-22982da03efe" (UID: "393d549e-beb1-4825-b70f-22982da03efe"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 00:34:26 crc kubenswrapper[4832]: I1003 00:34:26.990918 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/393d549e-beb1-4825-b70f-22982da03efe-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "393d549e-beb1-4825-b70f-22982da03efe" (UID: "393d549e-beb1-4825-b70f-22982da03efe"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 00:34:26 crc kubenswrapper[4832]: I1003 00:34:26.991008 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/393d549e-beb1-4825-b70f-22982da03efe-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "393d549e-beb1-4825-b70f-22982da03efe" (UID: "393d549e-beb1-4825-b70f-22982da03efe"). InnerVolumeSpecName "build-proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 00:34:26 crc kubenswrapper[4832]: I1003 00:34:26.991888 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/393d549e-beb1-4825-b70f-22982da03efe-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "393d549e-beb1-4825-b70f-22982da03efe" (UID: "393d549e-beb1-4825-b70f-22982da03efe"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 00:34:26 crc kubenswrapper[4832]: I1003 00:34:26.992283 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/393d549e-beb1-4825-b70f-22982da03efe-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "393d549e-beb1-4825-b70f-22982da03efe" (UID: "393d549e-beb1-4825-b70f-22982da03efe"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 00:34:26 crc kubenswrapper[4832]: I1003 00:34:26.992388 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/393d549e-beb1-4825-b70f-22982da03efe-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "393d549e-beb1-4825-b70f-22982da03efe" (UID: "393d549e-beb1-4825-b70f-22982da03efe"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 00:34:26 crc kubenswrapper[4832]: I1003 00:34:26.992648 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/393d549e-beb1-4825-b70f-22982da03efe-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "393d549e-beb1-4825-b70f-22982da03efe" (UID: "393d549e-beb1-4825-b70f-22982da03efe"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 00:34:26 crc kubenswrapper[4832]: I1003 00:34:26.996106 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/393d549e-beb1-4825-b70f-22982da03efe-kube-api-access-9ww5z" (OuterVolumeSpecName: "kube-api-access-9ww5z") pod "393d549e-beb1-4825-b70f-22982da03efe" (UID: "393d549e-beb1-4825-b70f-22982da03efe"). InnerVolumeSpecName "kube-api-access-9ww5z". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 00:34:26 crc kubenswrapper[4832]: I1003 00:34:26.996977 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/393d549e-beb1-4825-b70f-22982da03efe-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "393d549e-beb1-4825-b70f-22982da03efe" (UID: "393d549e-beb1-4825-b70f-22982da03efe"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 00:34:26 crc kubenswrapper[4832]: I1003 00:34:26.997557 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/393d549e-beb1-4825-b70f-22982da03efe-builder-dockercfg-bl95s-push" (OuterVolumeSpecName: "builder-dockercfg-bl95s-push") pod "393d549e-beb1-4825-b70f-22982da03efe" (UID: "393d549e-beb1-4825-b70f-22982da03efe"). InnerVolumeSpecName "builder-dockercfg-bl95s-push". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 00:34:26 crc kubenswrapper[4832]: I1003 00:34:26.999303 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/393d549e-beb1-4825-b70f-22982da03efe-builder-dockercfg-bl95s-pull" (OuterVolumeSpecName: "builder-dockercfg-bl95s-pull") pod "393d549e-beb1-4825-b70f-22982da03efe" (UID: "393d549e-beb1-4825-b70f-22982da03efe"). InnerVolumeSpecName "builder-dockercfg-bl95s-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 00:34:27 crc kubenswrapper[4832]: I1003 00:34:27.091521 4832 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/393d549e-beb1-4825-b70f-22982da03efe-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Oct 03 00:34:27 crc kubenswrapper[4832]: I1003 00:34:27.091874 4832 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9ww5z\" (UniqueName: \"kubernetes.io/projected/393d549e-beb1-4825-b70f-22982da03efe-kube-api-access-9ww5z\") on node \"crc\" DevicePath \"\"" Oct 03 00:34:27 crc kubenswrapper[4832]: I1003 00:34:27.091979 4832 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/393d549e-beb1-4825-b70f-22982da03efe-container-storage-run\") on node \"crc\" DevicePath \"\"" Oct 03 00:34:27 crc kubenswrapper[4832]: I1003 00:34:27.092068 4832 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-bl95s-pull\" (UniqueName: \"kubernetes.io/secret/393d549e-beb1-4825-b70f-22982da03efe-builder-dockercfg-bl95s-pull\") on node \"crc\" DevicePath \"\"" Oct 03 00:34:27 crc kubenswrapper[4832]: I1003 00:34:27.092147 4832 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/393d549e-beb1-4825-b70f-22982da03efe-buildcachedir\") on node \"crc\" DevicePath \"\"" Oct 03 00:34:27 crc kubenswrapper[4832]: I1003 00:34:27.092225 4832 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/393d549e-beb1-4825-b70f-22982da03efe-build-system-configs\") on node \"crc\" DevicePath \"\"" Oct 03 00:34:27 crc kubenswrapper[4832]: I1003 00:34:27.092303 4832 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/393d549e-beb1-4825-b70f-22982da03efe-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 03 00:34:27 crc kubenswrapper[4832]: I1003 00:34:27.092385 4832 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/393d549e-beb1-4825-b70f-22982da03efe-build-blob-cache\") on node \"crc\" DevicePath \"\"" Oct 03 00:34:27 crc kubenswrapper[4832]: I1003 00:34:27.092457 4832 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/393d549e-beb1-4825-b70f-22982da03efe-buildworkdir\") on node \"crc\" DevicePath \"\"" Oct 03 00:34:27 crc kubenswrapper[4832]: I1003 00:34:27.092526 4832 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/393d549e-beb1-4825-b70f-22982da03efe-container-storage-root\") on node \"crc\" DevicePath \"\"" Oct 03 00:34:27 crc kubenswrapper[4832]: I1003 00:34:27.092602 4832 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-bl95s-push\" (UniqueName: 
\"kubernetes.io/secret/393d549e-beb1-4825-b70f-22982da03efe-builder-dockercfg-bl95s-push\") on node \"crc\" DevicePath \"\"" Oct 03 00:34:27 crc kubenswrapper[4832]: I1003 00:34:27.092683 4832 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/393d549e-beb1-4825-b70f-22982da03efe-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 03 00:34:27 crc kubenswrapper[4832]: I1003 00:34:27.576448 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-2-build" event={"ID":"393d549e-beb1-4825-b70f-22982da03efe","Type":"ContainerDied","Data":"c1f80c4bf0abce04b00c5c286fc25296ba31641b4f67d9f117128d692ff0442b"} Oct 03 00:34:27 crc kubenswrapper[4832]: I1003 00:34:27.576481 4832 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c1f80c4bf0abce04b00c5c286fc25296ba31641b4f67d9f117128d692ff0442b" Oct 03 00:34:27 crc kubenswrapper[4832]: I1003 00:34:27.576543 4832 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-bundle-2-build" Oct 03 00:34:30 crc kubenswrapper[4832]: I1003 00:34:30.386860 4832 scope.go:117] "RemoveContainer" containerID="e44b7de9ac0dc4f733d645c5d56e63447d6c890efc8c243af73d5c7937b609ee" Oct 03 00:34:30 crc kubenswrapper[4832]: E1003 00:34:30.387094 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xzpk4_openshift-machine-config-operator(13eeb165-19f7-4e3f-a442-1c9c570d853b)\"" pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" podUID="13eeb165-19f7-4e3f-a442-1c9c570d853b" Oct 03 00:34:30 crc kubenswrapper[4832]: I1003 00:34:30.504268 4832 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/smart-gateway-operator-bundle-1-build"] Oct 03 00:34:30 crc kubenswrapper[4832]: E1003 00:34:30.504494 4832 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="393d549e-beb1-4825-b70f-22982da03efe" containerName="manage-dockerfile" Oct 03 00:34:30 crc kubenswrapper[4832]: I1003 00:34:30.504507 4832 state_mem.go:107] "Deleted CPUSet assignment" podUID="393d549e-beb1-4825-b70f-22982da03efe" containerName="manage-dockerfile" Oct 03 00:34:30 crc kubenswrapper[4832]: E1003 00:34:30.504521 4832 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="393d549e-beb1-4825-b70f-22982da03efe" containerName="git-clone" Oct 03 00:34:30 crc kubenswrapper[4832]: I1003 00:34:30.504527 4832 state_mem.go:107] "Deleted CPUSet assignment" podUID="393d549e-beb1-4825-b70f-22982da03efe" containerName="git-clone" Oct 03 00:34:30 crc kubenswrapper[4832]: E1003 00:34:30.504535 4832 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="393d549e-beb1-4825-b70f-22982da03efe" containerName="docker-build" Oct 03 00:34:30 crc kubenswrapper[4832]: I1003 00:34:30.504541 4832 state_mem.go:107] "Deleted CPUSet assignment" podUID="393d549e-beb1-4825-b70f-22982da03efe" containerName="docker-build" Oct 03 00:34:30 crc kubenswrapper[4832]: I1003 00:34:30.504639 4832 memory_manager.go:354] "RemoveStaleState removing state" podUID="393d549e-beb1-4825-b70f-22982da03efe" containerName="docker-build" Oct 03 00:34:30 crc kubenswrapper[4832]: I1003 00:34:30.505196 4832 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/smart-gateway-operator-bundle-1-build" Oct 03 00:34:30 crc kubenswrapper[4832]: I1003 00:34:30.507179 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-bundle-1-sys-config" Oct 03 00:34:30 crc kubenswrapper[4832]: I1003 00:34:30.507370 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-bundle-1-global-ca" Oct 03 00:34:30 crc kubenswrapper[4832]: I1003 00:34:30.507652 4832 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-bl95s" Oct 03 00:34:30 crc kubenswrapper[4832]: I1003 00:34:30.507811 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-bundle-1-ca" Oct 03 00:34:30 crc kubenswrapper[4832]: I1003 00:34:30.518581 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-bundle-1-build"] Oct 03 00:34:30 crc kubenswrapper[4832]: I1003 00:34:30.636183 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/b1d6d225-6982-420b-bf0a-a189ea77324f-build-blob-cache\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"b1d6d225-6982-420b-bf0a-a189ea77324f\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Oct 03 00:34:30 crc kubenswrapper[4832]: I1003 00:34:30.636250 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-bl95s-pull\" (UniqueName: \"kubernetes.io/secret/b1d6d225-6982-420b-bf0a-a189ea77324f-builder-dockercfg-bl95s-pull\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"b1d6d225-6982-420b-bf0a-a189ea77324f\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Oct 03 00:34:30 crc kubenswrapper[4832]: I1003 00:34:30.636282 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-bl95s-push\" (UniqueName: \"kubernetes.io/secret/b1d6d225-6982-420b-bf0a-a189ea77324f-builder-dockercfg-bl95s-push\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"b1d6d225-6982-420b-bf0a-a189ea77324f\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Oct 03 00:34:30 crc kubenswrapper[4832]: I1003 00:34:30.636308 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/b1d6d225-6982-420b-bf0a-a189ea77324f-node-pullsecrets\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"b1d6d225-6982-420b-bf0a-a189ea77324f\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Oct 03 00:34:30 crc kubenswrapper[4832]: I1003 00:34:30.636336 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/b1d6d225-6982-420b-bf0a-a189ea77324f-build-system-configs\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"b1d6d225-6982-420b-bf0a-a189ea77324f\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Oct 03 00:34:30 crc kubenswrapper[4832]: I1003 00:34:30.636359 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: 
\"kubernetes.io/empty-dir/b1d6d225-6982-420b-bf0a-a189ea77324f-container-storage-run\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"b1d6d225-6982-420b-bf0a-a189ea77324f\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Oct 03 00:34:30 crc kubenswrapper[4832]: I1003 00:34:30.636395 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b1d6d225-6982-420b-bf0a-a189ea77324f-build-proxy-ca-bundles\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"b1d6d225-6982-420b-bf0a-a189ea77324f\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Oct 03 00:34:30 crc kubenswrapper[4832]: I1003 00:34:30.636415 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zfkxg\" (UniqueName: \"kubernetes.io/projected/b1d6d225-6982-420b-bf0a-a189ea77324f-kube-api-access-zfkxg\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"b1d6d225-6982-420b-bf0a-a189ea77324f\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Oct 03 00:34:30 crc kubenswrapper[4832]: I1003 00:34:30.636495 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/b1d6d225-6982-420b-bf0a-a189ea77324f-container-storage-root\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"b1d6d225-6982-420b-bf0a-a189ea77324f\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Oct 03 00:34:30 crc kubenswrapper[4832]: I1003 00:34:30.636537 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b1d6d225-6982-420b-bf0a-a189ea77324f-build-ca-bundles\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"b1d6d225-6982-420b-bf0a-a189ea77324f\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Oct 03 00:34:30 crc kubenswrapper[4832]: I1003 00:34:30.636569 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/b1d6d225-6982-420b-bf0a-a189ea77324f-buildworkdir\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"b1d6d225-6982-420b-bf0a-a189ea77324f\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Oct 03 00:34:30 crc kubenswrapper[4832]: I1003 00:34:30.636601 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/b1d6d225-6982-420b-bf0a-a189ea77324f-buildcachedir\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"b1d6d225-6982-420b-bf0a-a189ea77324f\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Oct 03 00:34:30 crc kubenswrapper[4832]: I1003 00:34:30.738048 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zfkxg\" (UniqueName: \"kubernetes.io/projected/b1d6d225-6982-420b-bf0a-a189ea77324f-kube-api-access-zfkxg\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"b1d6d225-6982-420b-bf0a-a189ea77324f\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Oct 03 00:34:30 crc kubenswrapper[4832]: I1003 00:34:30.738125 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: 
\"kubernetes.io/empty-dir/b1d6d225-6982-420b-bf0a-a189ea77324f-container-storage-root\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"b1d6d225-6982-420b-bf0a-a189ea77324f\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Oct 03 00:34:30 crc kubenswrapper[4832]: I1003 00:34:30.738173 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b1d6d225-6982-420b-bf0a-a189ea77324f-build-ca-bundles\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"b1d6d225-6982-420b-bf0a-a189ea77324f\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Oct 03 00:34:30 crc kubenswrapper[4832]: I1003 00:34:30.738206 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/b1d6d225-6982-420b-bf0a-a189ea77324f-buildworkdir\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"b1d6d225-6982-420b-bf0a-a189ea77324f\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Oct 03 00:34:30 crc kubenswrapper[4832]: I1003 00:34:30.738242 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/b1d6d225-6982-420b-bf0a-a189ea77324f-buildcachedir\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"b1d6d225-6982-420b-bf0a-a189ea77324f\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Oct 03 00:34:30 crc kubenswrapper[4832]: I1003 00:34:30.738270 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/b1d6d225-6982-420b-bf0a-a189ea77324f-build-blob-cache\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"b1d6d225-6982-420b-bf0a-a189ea77324f\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Oct 03 00:34:30 crc kubenswrapper[4832]: I1003 00:34:30.738301 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-bl95s-pull\" (UniqueName: \"kubernetes.io/secret/b1d6d225-6982-420b-bf0a-a189ea77324f-builder-dockercfg-bl95s-pull\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"b1d6d225-6982-420b-bf0a-a189ea77324f\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Oct 03 00:34:30 crc kubenswrapper[4832]: I1003 00:34:30.738327 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-bl95s-push\" (UniqueName: \"kubernetes.io/secret/b1d6d225-6982-420b-bf0a-a189ea77324f-builder-dockercfg-bl95s-push\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"b1d6d225-6982-420b-bf0a-a189ea77324f\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Oct 03 00:34:30 crc kubenswrapper[4832]: I1003 00:34:30.738359 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/b1d6d225-6982-420b-bf0a-a189ea77324f-node-pullsecrets\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"b1d6d225-6982-420b-bf0a-a189ea77324f\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Oct 03 00:34:30 crc kubenswrapper[4832]: I1003 00:34:30.738358 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/b1d6d225-6982-420b-bf0a-a189ea77324f-buildcachedir\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"b1d6d225-6982-420b-bf0a-a189ea77324f\") " 
pod="service-telemetry/smart-gateway-operator-bundle-1-build" Oct 03 00:34:30 crc kubenswrapper[4832]: I1003 00:34:30.738386 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/b1d6d225-6982-420b-bf0a-a189ea77324f-build-system-configs\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"b1d6d225-6982-420b-bf0a-a189ea77324f\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Oct 03 00:34:30 crc kubenswrapper[4832]: I1003 00:34:30.738468 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/b1d6d225-6982-420b-bf0a-a189ea77324f-container-storage-run\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"b1d6d225-6982-420b-bf0a-a189ea77324f\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Oct 03 00:34:30 crc kubenswrapper[4832]: I1003 00:34:30.738487 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/b1d6d225-6982-420b-bf0a-a189ea77324f-node-pullsecrets\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"b1d6d225-6982-420b-bf0a-a189ea77324f\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Oct 03 00:34:30 crc kubenswrapper[4832]: I1003 00:34:30.738561 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/b1d6d225-6982-420b-bf0a-a189ea77324f-container-storage-root\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"b1d6d225-6982-420b-bf0a-a189ea77324f\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Oct 03 00:34:30 crc kubenswrapper[4832]: I1003 00:34:30.738566 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b1d6d225-6982-420b-bf0a-a189ea77324f-build-proxy-ca-bundles\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"b1d6d225-6982-420b-bf0a-a189ea77324f\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Oct 03 00:34:30 crc kubenswrapper[4832]: I1003 00:34:30.738700 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/b1d6d225-6982-420b-bf0a-a189ea77324f-buildworkdir\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"b1d6d225-6982-420b-bf0a-a189ea77324f\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Oct 03 00:34:30 crc kubenswrapper[4832]: I1003 00:34:30.738859 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/b1d6d225-6982-420b-bf0a-a189ea77324f-container-storage-run\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"b1d6d225-6982-420b-bf0a-a189ea77324f\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Oct 03 00:34:30 crc kubenswrapper[4832]: I1003 00:34:30.738953 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/b1d6d225-6982-420b-bf0a-a189ea77324f-build-blob-cache\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"b1d6d225-6982-420b-bf0a-a189ea77324f\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Oct 03 00:34:30 crc kubenswrapper[4832]: I1003 00:34:30.739388 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b1d6d225-6982-420b-bf0a-a189ea77324f-build-proxy-ca-bundles\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"b1d6d225-6982-420b-bf0a-a189ea77324f\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Oct 03 00:34:30 crc kubenswrapper[4832]: I1003 00:34:30.739399 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/b1d6d225-6982-420b-bf0a-a189ea77324f-build-system-configs\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"b1d6d225-6982-420b-bf0a-a189ea77324f\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Oct 03 00:34:30 crc kubenswrapper[4832]: I1003 00:34:30.740036 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b1d6d225-6982-420b-bf0a-a189ea77324f-build-ca-bundles\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"b1d6d225-6982-420b-bf0a-a189ea77324f\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Oct 03 00:34:30 crc kubenswrapper[4832]: I1003 00:34:30.742806 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-bl95s-pull\" (UniqueName: \"kubernetes.io/secret/b1d6d225-6982-420b-bf0a-a189ea77324f-builder-dockercfg-bl95s-pull\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"b1d6d225-6982-420b-bf0a-a189ea77324f\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Oct 03 00:34:30 crc kubenswrapper[4832]: I1003 00:34:30.742920 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-bl95s-push\" (UniqueName: \"kubernetes.io/secret/b1d6d225-6982-420b-bf0a-a189ea77324f-builder-dockercfg-bl95s-push\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"b1d6d225-6982-420b-bf0a-a189ea77324f\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Oct 03 00:34:30 crc kubenswrapper[4832]: I1003 00:34:30.756254 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zfkxg\" (UniqueName: \"kubernetes.io/projected/b1d6d225-6982-420b-bf0a-a189ea77324f-kube-api-access-zfkxg\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"b1d6d225-6982-420b-bf0a-a189ea77324f\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Oct 03 00:34:30 crc kubenswrapper[4832]: I1003 00:34:30.827946 4832 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/smart-gateway-operator-bundle-1-build" Oct 03 00:34:30 crc kubenswrapper[4832]: I1003 00:34:30.992544 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-bundle-1-build"] Oct 03 00:34:31 crc kubenswrapper[4832]: I1003 00:34:31.599407 4832 generic.go:334] "Generic (PLEG): container finished" podID="b1d6d225-6982-420b-bf0a-a189ea77324f" containerID="4555006c9fe0ac277c5d092c57890032831e69808062de2dbc38fbf6aca54357" exitCode=0 Oct 03 00:34:31 crc kubenswrapper[4832]: I1003 00:34:31.599924 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-1-build" event={"ID":"b1d6d225-6982-420b-bf0a-a189ea77324f","Type":"ContainerDied","Data":"4555006c9fe0ac277c5d092c57890032831e69808062de2dbc38fbf6aca54357"} Oct 03 00:34:31 crc kubenswrapper[4832]: I1003 00:34:31.600863 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-1-build" event={"ID":"b1d6d225-6982-420b-bf0a-a189ea77324f","Type":"ContainerStarted","Data":"ac95d4eccf857925410040d39162efb8b74d43e4ec08eeac5beffa27b431381c"} Oct 03 00:34:32 crc kubenswrapper[4832]: I1003 00:34:32.607523 4832 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-bundle-1-build_b1d6d225-6982-420b-bf0a-a189ea77324f/docker-build/0.log" Oct 03 00:34:32 crc kubenswrapper[4832]: I1003 00:34:32.608040 4832 generic.go:334] "Generic (PLEG): container finished" podID="b1d6d225-6982-420b-bf0a-a189ea77324f" containerID="d323078a056e84f35a2ed2626bee82bd9e2cc82f663ca7a83e0189cb0d4d4c6f" exitCode=1 Oct 03 00:34:32 crc kubenswrapper[4832]: I1003 00:34:32.608069 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-1-build" event={"ID":"b1d6d225-6982-420b-bf0a-a189ea77324f","Type":"ContainerDied","Data":"d323078a056e84f35a2ed2626bee82bd9e2cc82f663ca7a83e0189cb0d4d4c6f"} Oct 03 00:34:33 crc kubenswrapper[4832]: I1003 00:34:33.858887 4832 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-bundle-1-build_b1d6d225-6982-420b-bf0a-a189ea77324f/docker-build/0.log" Oct 03 00:34:33 crc kubenswrapper[4832]: I1003 00:34:33.859327 4832 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/smart-gateway-operator-bundle-1-build" Oct 03 00:34:33 crc kubenswrapper[4832]: I1003 00:34:33.976043 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/b1d6d225-6982-420b-bf0a-a189ea77324f-build-system-configs\") pod \"b1d6d225-6982-420b-bf0a-a189ea77324f\" (UID: \"b1d6d225-6982-420b-bf0a-a189ea77324f\") " Oct 03 00:34:33 crc kubenswrapper[4832]: I1003 00:34:33.976138 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b1d6d225-6982-420b-bf0a-a189ea77324f-build-proxy-ca-bundles\") pod \"b1d6d225-6982-420b-bf0a-a189ea77324f\" (UID: \"b1d6d225-6982-420b-bf0a-a189ea77324f\") " Oct 03 00:34:33 crc kubenswrapper[4832]: I1003 00:34:33.976165 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/b1d6d225-6982-420b-bf0a-a189ea77324f-container-storage-run\") pod \"b1d6d225-6982-420b-bf0a-a189ea77324f\" (UID: \"b1d6d225-6982-420b-bf0a-a189ea77324f\") " Oct 03 00:34:33 crc kubenswrapper[4832]: I1003 00:34:33.976201 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b1d6d225-6982-420b-bf0a-a189ea77324f-build-ca-bundles\") pod \"b1d6d225-6982-420b-bf0a-a189ea77324f\" (UID: \"b1d6d225-6982-420b-bf0a-a189ea77324f\") " Oct 03 00:34:33 crc kubenswrapper[4832]: I1003 00:34:33.976230 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/b1d6d225-6982-420b-bf0a-a189ea77324f-buildcachedir\") pod \"b1d6d225-6982-420b-bf0a-a189ea77324f\" (UID: \"b1d6d225-6982-420b-bf0a-a189ea77324f\") " Oct 03 00:34:33 crc kubenswrapper[4832]: I1003 00:34:33.976249 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-bl95s-pull\" (UniqueName: \"kubernetes.io/secret/b1d6d225-6982-420b-bf0a-a189ea77324f-builder-dockercfg-bl95s-pull\") pod \"b1d6d225-6982-420b-bf0a-a189ea77324f\" (UID: \"b1d6d225-6982-420b-bf0a-a189ea77324f\") " Oct 03 00:34:33 crc kubenswrapper[4832]: I1003 00:34:33.976284 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/b1d6d225-6982-420b-bf0a-a189ea77324f-build-blob-cache\") pod \"b1d6d225-6982-420b-bf0a-a189ea77324f\" (UID: \"b1d6d225-6982-420b-bf0a-a189ea77324f\") " Oct 03 00:34:33 crc kubenswrapper[4832]: I1003 00:34:33.976300 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/b1d6d225-6982-420b-bf0a-a189ea77324f-buildworkdir\") pod \"b1d6d225-6982-420b-bf0a-a189ea77324f\" (UID: \"b1d6d225-6982-420b-bf0a-a189ea77324f\") " Oct 03 00:34:33 crc kubenswrapper[4832]: I1003 00:34:33.976330 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/b1d6d225-6982-420b-bf0a-a189ea77324f-node-pullsecrets\") pod \"b1d6d225-6982-420b-bf0a-a189ea77324f\" (UID: \"b1d6d225-6982-420b-bf0a-a189ea77324f\") " Oct 03 00:34:33 crc kubenswrapper[4832]: I1003 00:34:33.976345 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: 
\"kubernetes.io/empty-dir/b1d6d225-6982-420b-bf0a-a189ea77324f-container-storage-root\") pod \"b1d6d225-6982-420b-bf0a-a189ea77324f\" (UID: \"b1d6d225-6982-420b-bf0a-a189ea77324f\") " Oct 03 00:34:33 crc kubenswrapper[4832]: I1003 00:34:33.976363 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-bl95s-push\" (UniqueName: \"kubernetes.io/secret/b1d6d225-6982-420b-bf0a-a189ea77324f-builder-dockercfg-bl95s-push\") pod \"b1d6d225-6982-420b-bf0a-a189ea77324f\" (UID: \"b1d6d225-6982-420b-bf0a-a189ea77324f\") " Oct 03 00:34:33 crc kubenswrapper[4832]: I1003 00:34:33.976408 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zfkxg\" (UniqueName: \"kubernetes.io/projected/b1d6d225-6982-420b-bf0a-a189ea77324f-kube-api-access-zfkxg\") pod \"b1d6d225-6982-420b-bf0a-a189ea77324f\" (UID: \"b1d6d225-6982-420b-bf0a-a189ea77324f\") " Oct 03 00:34:33 crc kubenswrapper[4832]: I1003 00:34:33.976530 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b1d6d225-6982-420b-bf0a-a189ea77324f-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "b1d6d225-6982-420b-bf0a-a189ea77324f" (UID: "b1d6d225-6982-420b-bf0a-a189ea77324f"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 00:34:33 crc kubenswrapper[4832]: I1003 00:34:33.976576 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1d6d225-6982-420b-bf0a-a189ea77324f-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "b1d6d225-6982-420b-bf0a-a189ea77324f" (UID: "b1d6d225-6982-420b-bf0a-a189ea77324f"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 00:34:33 crc kubenswrapper[4832]: I1003 00:34:33.976613 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b1d6d225-6982-420b-bf0a-a189ea77324f-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "b1d6d225-6982-420b-bf0a-a189ea77324f" (UID: "b1d6d225-6982-420b-bf0a-a189ea77324f"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 00:34:33 crc kubenswrapper[4832]: I1003 00:34:33.977307 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1d6d225-6982-420b-bf0a-a189ea77324f-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "b1d6d225-6982-420b-bf0a-a189ea77324f" (UID: "b1d6d225-6982-420b-bf0a-a189ea77324f"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 00:34:33 crc kubenswrapper[4832]: I1003 00:34:33.977336 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1d6d225-6982-420b-bf0a-a189ea77324f-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "b1d6d225-6982-420b-bf0a-a189ea77324f" (UID: "b1d6d225-6982-420b-bf0a-a189ea77324f"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 00:34:33 crc kubenswrapper[4832]: I1003 00:34:33.977412 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1d6d225-6982-420b-bf0a-a189ea77324f-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "b1d6d225-6982-420b-bf0a-a189ea77324f" (UID: "b1d6d225-6982-420b-bf0a-a189ea77324f"). InnerVolumeSpecName "build-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 00:34:33 crc kubenswrapper[4832]: I1003 00:34:33.978130 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1d6d225-6982-420b-bf0a-a189ea77324f-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "b1d6d225-6982-420b-bf0a-a189ea77324f" (UID: "b1d6d225-6982-420b-bf0a-a189ea77324f"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 00:34:33 crc kubenswrapper[4832]: I1003 00:34:33.978193 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1d6d225-6982-420b-bf0a-a189ea77324f-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "b1d6d225-6982-420b-bf0a-a189ea77324f" (UID: "b1d6d225-6982-420b-bf0a-a189ea77324f"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 00:34:33 crc kubenswrapper[4832]: I1003 00:34:33.979067 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1d6d225-6982-420b-bf0a-a189ea77324f-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "b1d6d225-6982-420b-bf0a-a189ea77324f" (UID: "b1d6d225-6982-420b-bf0a-a189ea77324f"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 00:34:33 crc kubenswrapper[4832]: I1003 00:34:33.981588 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1d6d225-6982-420b-bf0a-a189ea77324f-builder-dockercfg-bl95s-push" (OuterVolumeSpecName: "builder-dockercfg-bl95s-push") pod "b1d6d225-6982-420b-bf0a-a189ea77324f" (UID: "b1d6d225-6982-420b-bf0a-a189ea77324f"). InnerVolumeSpecName "builder-dockercfg-bl95s-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 00:34:33 crc kubenswrapper[4832]: I1003 00:34:33.981746 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1d6d225-6982-420b-bf0a-a189ea77324f-builder-dockercfg-bl95s-pull" (OuterVolumeSpecName: "builder-dockercfg-bl95s-pull") pod "b1d6d225-6982-420b-bf0a-a189ea77324f" (UID: "b1d6d225-6982-420b-bf0a-a189ea77324f"). InnerVolumeSpecName "builder-dockercfg-bl95s-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 00:34:33 crc kubenswrapper[4832]: I1003 00:34:33.982998 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1d6d225-6982-420b-bf0a-a189ea77324f-kube-api-access-zfkxg" (OuterVolumeSpecName: "kube-api-access-zfkxg") pod "b1d6d225-6982-420b-bf0a-a189ea77324f" (UID: "b1d6d225-6982-420b-bf0a-a189ea77324f"). InnerVolumeSpecName "kube-api-access-zfkxg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 00:34:34 crc kubenswrapper[4832]: I1003 00:34:34.077635 4832 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b1d6d225-6982-420b-bf0a-a189ea77324f-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 03 00:34:34 crc kubenswrapper[4832]: I1003 00:34:34.077670 4832 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/b1d6d225-6982-420b-bf0a-a189ea77324f-buildcachedir\") on node \"crc\" DevicePath \"\"" Oct 03 00:34:34 crc kubenswrapper[4832]: I1003 00:34:34.077682 4832 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-bl95s-pull\" (UniqueName: \"kubernetes.io/secret/b1d6d225-6982-420b-bf0a-a189ea77324f-builder-dockercfg-bl95s-pull\") on node \"crc\" DevicePath \"\"" Oct 03 00:34:34 crc kubenswrapper[4832]: I1003 00:34:34.077693 4832 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/b1d6d225-6982-420b-bf0a-a189ea77324f-build-blob-cache\") on node \"crc\" DevicePath \"\"" Oct 03 00:34:34 crc kubenswrapper[4832]: I1003 00:34:34.077703 4832 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/b1d6d225-6982-420b-bf0a-a189ea77324f-buildworkdir\") on node \"crc\" DevicePath \"\"" Oct 03 00:34:34 crc kubenswrapper[4832]: I1003 00:34:34.077710 4832 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/b1d6d225-6982-420b-bf0a-a189ea77324f-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Oct 03 00:34:34 crc kubenswrapper[4832]: I1003 00:34:34.077718 4832 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/b1d6d225-6982-420b-bf0a-a189ea77324f-container-storage-root\") on node \"crc\" DevicePath \"\"" Oct 03 00:34:34 crc kubenswrapper[4832]: I1003 00:34:34.077728 4832 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-bl95s-push\" (UniqueName: \"kubernetes.io/secret/b1d6d225-6982-420b-bf0a-a189ea77324f-builder-dockercfg-bl95s-push\") on node \"crc\" DevicePath \"\"" Oct 03 00:34:34 crc kubenswrapper[4832]: I1003 00:34:34.077737 4832 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zfkxg\" (UniqueName: \"kubernetes.io/projected/b1d6d225-6982-420b-bf0a-a189ea77324f-kube-api-access-zfkxg\") on node \"crc\" DevicePath \"\"" Oct 03 00:34:34 crc kubenswrapper[4832]: I1003 00:34:34.077745 4832 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/b1d6d225-6982-420b-bf0a-a189ea77324f-build-system-configs\") on node \"crc\" DevicePath \"\"" Oct 03 00:34:34 crc kubenswrapper[4832]: I1003 00:34:34.077767 4832 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b1d6d225-6982-420b-bf0a-a189ea77324f-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 03 00:34:34 crc kubenswrapper[4832]: I1003 00:34:34.077775 4832 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/b1d6d225-6982-420b-bf0a-a189ea77324f-container-storage-run\") on node \"crc\" DevicePath \"\"" Oct 03 00:34:34 crc kubenswrapper[4832]: I1003 00:34:34.620974 4832 log.go:25] "Finished parsing log file" 
path="/var/log/pods/service-telemetry_smart-gateway-operator-bundle-1-build_b1d6d225-6982-420b-bf0a-a189ea77324f/docker-build/0.log" Oct 03 00:34:34 crc kubenswrapper[4832]: I1003 00:34:34.621330 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-1-build" event={"ID":"b1d6d225-6982-420b-bf0a-a189ea77324f","Type":"ContainerDied","Data":"ac95d4eccf857925410040d39162efb8b74d43e4ec08eeac5beffa27b431381c"} Oct 03 00:34:34 crc kubenswrapper[4832]: I1003 00:34:34.621360 4832 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ac95d4eccf857925410040d39162efb8b74d43e4ec08eeac5beffa27b431381c" Oct 03 00:34:34 crc kubenswrapper[4832]: I1003 00:34:34.621416 4832 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-bundle-1-build" Oct 03 00:34:40 crc kubenswrapper[4832]: I1003 00:34:40.971061 4832 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/smart-gateway-operator-bundle-1-build"] Oct 03 00:34:40 crc kubenswrapper[4832]: I1003 00:34:40.975986 4832 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/smart-gateway-operator-bundle-1-build"] Oct 03 00:34:41 crc kubenswrapper[4832]: I1003 00:34:41.403036 4832 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1d6d225-6982-420b-bf0a-a189ea77324f" path="/var/lib/kubelet/pods/b1d6d225-6982-420b-bf0a-a189ea77324f/volumes" Oct 03 00:34:42 crc kubenswrapper[4832]: I1003 00:34:42.387986 4832 scope.go:117] "RemoveContainer" containerID="e44b7de9ac0dc4f733d645c5d56e63447d6c890efc8c243af73d5c7937b609ee" Oct 03 00:34:42 crc kubenswrapper[4832]: E1003 00:34:42.388354 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xzpk4_openshift-machine-config-operator(13eeb165-19f7-4e3f-a442-1c9c570d853b)\"" pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" podUID="13eeb165-19f7-4e3f-a442-1c9c570d853b" Oct 03 00:34:42 crc kubenswrapper[4832]: I1003 00:34:42.645018 4832 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/smart-gateway-operator-bundle-2-build"] Oct 03 00:34:42 crc kubenswrapper[4832]: E1003 00:34:42.645234 4832 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1d6d225-6982-420b-bf0a-a189ea77324f" containerName="docker-build" Oct 03 00:34:42 crc kubenswrapper[4832]: I1003 00:34:42.645248 4832 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1d6d225-6982-420b-bf0a-a189ea77324f" containerName="docker-build" Oct 03 00:34:42 crc kubenswrapper[4832]: E1003 00:34:42.645261 4832 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1d6d225-6982-420b-bf0a-a189ea77324f" containerName="manage-dockerfile" Oct 03 00:34:42 crc kubenswrapper[4832]: I1003 00:34:42.645267 4832 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1d6d225-6982-420b-bf0a-a189ea77324f" containerName="manage-dockerfile" Oct 03 00:34:42 crc kubenswrapper[4832]: I1003 00:34:42.645373 4832 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1d6d225-6982-420b-bf0a-a189ea77324f" containerName="docker-build" Oct 03 00:34:42 crc kubenswrapper[4832]: I1003 00:34:42.646181 4832 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/smart-gateway-operator-bundle-2-build" Oct 03 00:34:42 crc kubenswrapper[4832]: I1003 00:34:42.648570 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-bundle-2-ca" Oct 03 00:34:42 crc kubenswrapper[4832]: I1003 00:34:42.648570 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-bundle-2-sys-config" Oct 03 00:34:42 crc kubenswrapper[4832]: I1003 00:34:42.649402 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-bundle-2-global-ca" Oct 03 00:34:42 crc kubenswrapper[4832]: I1003 00:34:42.649502 4832 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-bl95s" Oct 03 00:34:42 crc kubenswrapper[4832]: I1003 00:34:42.668347 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-bundle-2-build"] Oct 03 00:34:42 crc kubenswrapper[4832]: I1003 00:34:42.683416 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/14bc1958-b143-4ef7-ab60-c34975ecfb68-buildworkdir\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"14bc1958-b143-4ef7-ab60-c34975ecfb68\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Oct 03 00:34:42 crc kubenswrapper[4832]: I1003 00:34:42.683472 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/14bc1958-b143-4ef7-ab60-c34975ecfb68-node-pullsecrets\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"14bc1958-b143-4ef7-ab60-c34975ecfb68\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Oct 03 00:34:42 crc kubenswrapper[4832]: I1003 00:34:42.683495 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-bl95s-pull\" (UniqueName: \"kubernetes.io/secret/14bc1958-b143-4ef7-ab60-c34975ecfb68-builder-dockercfg-bl95s-pull\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"14bc1958-b143-4ef7-ab60-c34975ecfb68\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Oct 03 00:34:42 crc kubenswrapper[4832]: I1003 00:34:42.683519 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/14bc1958-b143-4ef7-ab60-c34975ecfb68-container-storage-run\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"14bc1958-b143-4ef7-ab60-c34975ecfb68\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Oct 03 00:34:42 crc kubenswrapper[4832]: I1003 00:34:42.683551 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-bl95s-push\" (UniqueName: \"kubernetes.io/secret/14bc1958-b143-4ef7-ab60-c34975ecfb68-builder-dockercfg-bl95s-push\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"14bc1958-b143-4ef7-ab60-c34975ecfb68\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Oct 03 00:34:42 crc kubenswrapper[4832]: I1003 00:34:42.683601 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/14bc1958-b143-4ef7-ab60-c34975ecfb68-buildcachedir\") pod 
\"smart-gateway-operator-bundle-2-build\" (UID: \"14bc1958-b143-4ef7-ab60-c34975ecfb68\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Oct 03 00:34:42 crc kubenswrapper[4832]: I1003 00:34:42.683622 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/14bc1958-b143-4ef7-ab60-c34975ecfb68-build-system-configs\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"14bc1958-b143-4ef7-ab60-c34975ecfb68\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Oct 03 00:34:42 crc kubenswrapper[4832]: I1003 00:34:42.683643 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/14bc1958-b143-4ef7-ab60-c34975ecfb68-build-proxy-ca-bundles\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"14bc1958-b143-4ef7-ab60-c34975ecfb68\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Oct 03 00:34:42 crc kubenswrapper[4832]: I1003 00:34:42.683685 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/14bc1958-b143-4ef7-ab60-c34975ecfb68-build-ca-bundles\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"14bc1958-b143-4ef7-ab60-c34975ecfb68\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Oct 03 00:34:42 crc kubenswrapper[4832]: I1003 00:34:42.683705 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/14bc1958-b143-4ef7-ab60-c34975ecfb68-container-storage-root\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"14bc1958-b143-4ef7-ab60-c34975ecfb68\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Oct 03 00:34:42 crc kubenswrapper[4832]: I1003 00:34:42.683727 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gcbg6\" (UniqueName: \"kubernetes.io/projected/14bc1958-b143-4ef7-ab60-c34975ecfb68-kube-api-access-gcbg6\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"14bc1958-b143-4ef7-ab60-c34975ecfb68\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Oct 03 00:34:42 crc kubenswrapper[4832]: I1003 00:34:42.683777 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/14bc1958-b143-4ef7-ab60-c34975ecfb68-build-blob-cache\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"14bc1958-b143-4ef7-ab60-c34975ecfb68\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Oct 03 00:34:42 crc kubenswrapper[4832]: I1003 00:34:42.785581 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/14bc1958-b143-4ef7-ab60-c34975ecfb68-build-blob-cache\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"14bc1958-b143-4ef7-ab60-c34975ecfb68\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Oct 03 00:34:42 crc kubenswrapper[4832]: I1003 00:34:42.785679 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/14bc1958-b143-4ef7-ab60-c34975ecfb68-buildworkdir\") pod \"smart-gateway-operator-bundle-2-build\" (UID: 
\"14bc1958-b143-4ef7-ab60-c34975ecfb68\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Oct 03 00:34:42 crc kubenswrapper[4832]: I1003 00:34:42.785721 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/14bc1958-b143-4ef7-ab60-c34975ecfb68-node-pullsecrets\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"14bc1958-b143-4ef7-ab60-c34975ecfb68\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Oct 03 00:34:42 crc kubenswrapper[4832]: I1003 00:34:42.785786 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-bl95s-pull\" (UniqueName: \"kubernetes.io/secret/14bc1958-b143-4ef7-ab60-c34975ecfb68-builder-dockercfg-bl95s-pull\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"14bc1958-b143-4ef7-ab60-c34975ecfb68\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Oct 03 00:34:42 crc kubenswrapper[4832]: I1003 00:34:42.785825 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/14bc1958-b143-4ef7-ab60-c34975ecfb68-container-storage-run\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"14bc1958-b143-4ef7-ab60-c34975ecfb68\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Oct 03 00:34:42 crc kubenswrapper[4832]: I1003 00:34:42.786056 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/14bc1958-b143-4ef7-ab60-c34975ecfb68-node-pullsecrets\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"14bc1958-b143-4ef7-ab60-c34975ecfb68\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Oct 03 00:34:42 crc kubenswrapper[4832]: I1003 00:34:42.786212 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/14bc1958-b143-4ef7-ab60-c34975ecfb68-buildworkdir\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"14bc1958-b143-4ef7-ab60-c34975ecfb68\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Oct 03 00:34:42 crc kubenswrapper[4832]: I1003 00:34:42.786371 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/14bc1958-b143-4ef7-ab60-c34975ecfb68-build-blob-cache\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"14bc1958-b143-4ef7-ab60-c34975ecfb68\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Oct 03 00:34:42 crc kubenswrapper[4832]: I1003 00:34:42.786597 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-bl95s-push\" (UniqueName: \"kubernetes.io/secret/14bc1958-b143-4ef7-ab60-c34975ecfb68-builder-dockercfg-bl95s-push\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"14bc1958-b143-4ef7-ab60-c34975ecfb68\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Oct 03 00:34:42 crc kubenswrapper[4832]: I1003 00:34:42.786663 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/14bc1958-b143-4ef7-ab60-c34975ecfb68-buildcachedir\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"14bc1958-b143-4ef7-ab60-c34975ecfb68\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Oct 03 00:34:42 crc kubenswrapper[4832]: I1003 00:34:42.786705 4832 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/14bc1958-b143-4ef7-ab60-c34975ecfb68-build-system-configs\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"14bc1958-b143-4ef7-ab60-c34975ecfb68\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Oct 03 00:34:42 crc kubenswrapper[4832]: I1003 00:34:42.786743 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/14bc1958-b143-4ef7-ab60-c34975ecfb68-build-proxy-ca-bundles\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"14bc1958-b143-4ef7-ab60-c34975ecfb68\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Oct 03 00:34:42 crc kubenswrapper[4832]: I1003 00:34:42.786746 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/14bc1958-b143-4ef7-ab60-c34975ecfb68-buildcachedir\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"14bc1958-b143-4ef7-ab60-c34975ecfb68\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Oct 03 00:34:42 crc kubenswrapper[4832]: I1003 00:34:42.786799 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/14bc1958-b143-4ef7-ab60-c34975ecfb68-build-ca-bundles\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"14bc1958-b143-4ef7-ab60-c34975ecfb68\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Oct 03 00:34:42 crc kubenswrapper[4832]: I1003 00:34:42.786834 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/14bc1958-b143-4ef7-ab60-c34975ecfb68-container-storage-root\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"14bc1958-b143-4ef7-ab60-c34975ecfb68\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Oct 03 00:34:42 crc kubenswrapper[4832]: I1003 00:34:42.786869 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/14bc1958-b143-4ef7-ab60-c34975ecfb68-container-storage-run\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"14bc1958-b143-4ef7-ab60-c34975ecfb68\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Oct 03 00:34:42 crc kubenswrapper[4832]: I1003 00:34:42.786902 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gcbg6\" (UniqueName: \"kubernetes.io/projected/14bc1958-b143-4ef7-ab60-c34975ecfb68-kube-api-access-gcbg6\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"14bc1958-b143-4ef7-ab60-c34975ecfb68\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Oct 03 00:34:42 crc kubenswrapper[4832]: I1003 00:34:42.787461 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/14bc1958-b143-4ef7-ab60-c34975ecfb68-container-storage-root\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"14bc1958-b143-4ef7-ab60-c34975ecfb68\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Oct 03 00:34:42 crc kubenswrapper[4832]: I1003 00:34:42.787855 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/14bc1958-b143-4ef7-ab60-c34975ecfb68-build-proxy-ca-bundles\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"14bc1958-b143-4ef7-ab60-c34975ecfb68\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Oct 03 00:34:42 crc kubenswrapper[4832]: I1003 00:34:42.788079 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/14bc1958-b143-4ef7-ab60-c34975ecfb68-build-system-configs\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"14bc1958-b143-4ef7-ab60-c34975ecfb68\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Oct 03 00:34:42 crc kubenswrapper[4832]: I1003 00:34:42.789989 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/14bc1958-b143-4ef7-ab60-c34975ecfb68-build-ca-bundles\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"14bc1958-b143-4ef7-ab60-c34975ecfb68\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Oct 03 00:34:42 crc kubenswrapper[4832]: I1003 00:34:42.797048 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-bl95s-pull\" (UniqueName: \"kubernetes.io/secret/14bc1958-b143-4ef7-ab60-c34975ecfb68-builder-dockercfg-bl95s-pull\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"14bc1958-b143-4ef7-ab60-c34975ecfb68\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Oct 03 00:34:42 crc kubenswrapper[4832]: I1003 00:34:42.797396 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-bl95s-push\" (UniqueName: \"kubernetes.io/secret/14bc1958-b143-4ef7-ab60-c34975ecfb68-builder-dockercfg-bl95s-push\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"14bc1958-b143-4ef7-ab60-c34975ecfb68\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Oct 03 00:34:42 crc kubenswrapper[4832]: I1003 00:34:42.812638 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gcbg6\" (UniqueName: \"kubernetes.io/projected/14bc1958-b143-4ef7-ab60-c34975ecfb68-kube-api-access-gcbg6\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"14bc1958-b143-4ef7-ab60-c34975ecfb68\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Oct 03 00:34:42 crc kubenswrapper[4832]: I1003 00:34:42.982968 4832 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/smart-gateway-operator-bundle-2-build" Oct 03 00:34:43 crc kubenswrapper[4832]: I1003 00:34:43.276739 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-bundle-2-build"] Oct 03 00:34:43 crc kubenswrapper[4832]: I1003 00:34:43.686859 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-2-build" event={"ID":"14bc1958-b143-4ef7-ab60-c34975ecfb68","Type":"ContainerStarted","Data":"1e4ec65beeed1dd4c35dd8afba729d9ceef53687fefabf49062ce03393f10a66"} Oct 03 00:34:43 crc kubenswrapper[4832]: I1003 00:34:43.688209 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-2-build" event={"ID":"14bc1958-b143-4ef7-ab60-c34975ecfb68","Type":"ContainerStarted","Data":"7b0de1a934c720cc672d8530661447b91086d742f1a6444065385139482407fb"} Oct 03 00:34:44 crc kubenswrapper[4832]: I1003 00:34:44.694023 4832 generic.go:334] "Generic (PLEG): container finished" podID="14bc1958-b143-4ef7-ab60-c34975ecfb68" containerID="1e4ec65beeed1dd4c35dd8afba729d9ceef53687fefabf49062ce03393f10a66" exitCode=0 Oct 03 00:34:44 crc kubenswrapper[4832]: I1003 00:34:44.694069 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-2-build" event={"ID":"14bc1958-b143-4ef7-ab60-c34975ecfb68","Type":"ContainerDied","Data":"1e4ec65beeed1dd4c35dd8afba729d9ceef53687fefabf49062ce03393f10a66"} Oct 03 00:34:45 crc kubenswrapper[4832]: I1003 00:34:45.711108 4832 generic.go:334] "Generic (PLEG): container finished" podID="14bc1958-b143-4ef7-ab60-c34975ecfb68" containerID="4a3e73cef620fd1dd9bb1918df8b2b58530476cac2f37b9d0901bba27ad83603" exitCode=0 Oct 03 00:34:45 crc kubenswrapper[4832]: I1003 00:34:45.711153 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-2-build" event={"ID":"14bc1958-b143-4ef7-ab60-c34975ecfb68","Type":"ContainerDied","Data":"4a3e73cef620fd1dd9bb1918df8b2b58530476cac2f37b9d0901bba27ad83603"} Oct 03 00:34:45 crc kubenswrapper[4832]: I1003 00:34:45.765544 4832 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-bundle-2-build_14bc1958-b143-4ef7-ab60-c34975ecfb68/manage-dockerfile/0.log" Oct 03 00:34:46 crc kubenswrapper[4832]: I1003 00:34:46.717523 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-2-build" event={"ID":"14bc1958-b143-4ef7-ab60-c34975ecfb68","Type":"ContainerStarted","Data":"143b354980bf3af6211cad620f31b387b16a10732d136170b8753a52e6a239c1"} Oct 03 00:34:48 crc kubenswrapper[4832]: I1003 00:34:48.734886 4832 generic.go:334] "Generic (PLEG): container finished" podID="14bc1958-b143-4ef7-ab60-c34975ecfb68" containerID="143b354980bf3af6211cad620f31b387b16a10732d136170b8753a52e6a239c1" exitCode=0 Oct 03 00:34:48 crc kubenswrapper[4832]: I1003 00:34:48.735004 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-2-build" event={"ID":"14bc1958-b143-4ef7-ab60-c34975ecfb68","Type":"ContainerDied","Data":"143b354980bf3af6211cad620f31b387b16a10732d136170b8753a52e6a239c1"} Oct 03 00:34:49 crc kubenswrapper[4832]: I1003 00:34:49.978920 4832 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/smart-gateway-operator-bundle-2-build" Oct 03 00:34:49 crc kubenswrapper[4832]: I1003 00:34:49.992561 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/14bc1958-b143-4ef7-ab60-c34975ecfb68-build-system-configs\") pod \"14bc1958-b143-4ef7-ab60-c34975ecfb68\" (UID: \"14bc1958-b143-4ef7-ab60-c34975ecfb68\") " Oct 03 00:34:49 crc kubenswrapper[4832]: I1003 00:34:49.992639 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-bl95s-push\" (UniqueName: \"kubernetes.io/secret/14bc1958-b143-4ef7-ab60-c34975ecfb68-builder-dockercfg-bl95s-push\") pod \"14bc1958-b143-4ef7-ab60-c34975ecfb68\" (UID: \"14bc1958-b143-4ef7-ab60-c34975ecfb68\") " Oct 03 00:34:49 crc kubenswrapper[4832]: I1003 00:34:49.992667 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/14bc1958-b143-4ef7-ab60-c34975ecfb68-build-blob-cache\") pod \"14bc1958-b143-4ef7-ab60-c34975ecfb68\" (UID: \"14bc1958-b143-4ef7-ab60-c34975ecfb68\") " Oct 03 00:34:49 crc kubenswrapper[4832]: I1003 00:34:49.992702 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gcbg6\" (UniqueName: \"kubernetes.io/projected/14bc1958-b143-4ef7-ab60-c34975ecfb68-kube-api-access-gcbg6\") pod \"14bc1958-b143-4ef7-ab60-c34975ecfb68\" (UID: \"14bc1958-b143-4ef7-ab60-c34975ecfb68\") " Oct 03 00:34:49 crc kubenswrapper[4832]: I1003 00:34:49.992731 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/14bc1958-b143-4ef7-ab60-c34975ecfb68-buildcachedir\") pod \"14bc1958-b143-4ef7-ab60-c34975ecfb68\" (UID: \"14bc1958-b143-4ef7-ab60-c34975ecfb68\") " Oct 03 00:34:49 crc kubenswrapper[4832]: I1003 00:34:49.992767 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/14bc1958-b143-4ef7-ab60-c34975ecfb68-build-ca-bundles\") pod \"14bc1958-b143-4ef7-ab60-c34975ecfb68\" (UID: \"14bc1958-b143-4ef7-ab60-c34975ecfb68\") " Oct 03 00:34:49 crc kubenswrapper[4832]: I1003 00:34:49.992795 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/14bc1958-b143-4ef7-ab60-c34975ecfb68-container-storage-run\") pod \"14bc1958-b143-4ef7-ab60-c34975ecfb68\" (UID: \"14bc1958-b143-4ef7-ab60-c34975ecfb68\") " Oct 03 00:34:49 crc kubenswrapper[4832]: I1003 00:34:49.992817 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/14bc1958-b143-4ef7-ab60-c34975ecfb68-node-pullsecrets\") pod \"14bc1958-b143-4ef7-ab60-c34975ecfb68\" (UID: \"14bc1958-b143-4ef7-ab60-c34975ecfb68\") " Oct 03 00:34:49 crc kubenswrapper[4832]: I1003 00:34:49.992848 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/14bc1958-b143-4ef7-ab60-c34975ecfb68-container-storage-root\") pod \"14bc1958-b143-4ef7-ab60-c34975ecfb68\" (UID: \"14bc1958-b143-4ef7-ab60-c34975ecfb68\") " Oct 03 00:34:49 crc kubenswrapper[4832]: I1003 00:34:49.992874 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/14bc1958-b143-4ef7-ab60-c34975ecfb68-build-proxy-ca-bundles\") pod \"14bc1958-b143-4ef7-ab60-c34975ecfb68\" (UID: \"14bc1958-b143-4ef7-ab60-c34975ecfb68\") " Oct 03 00:34:49 crc kubenswrapper[4832]: I1003 00:34:49.992903 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/14bc1958-b143-4ef7-ab60-c34975ecfb68-buildworkdir\") pod \"14bc1958-b143-4ef7-ab60-c34975ecfb68\" (UID: \"14bc1958-b143-4ef7-ab60-c34975ecfb68\") " Oct 03 00:34:49 crc kubenswrapper[4832]: I1003 00:34:49.992940 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-bl95s-pull\" (UniqueName: \"kubernetes.io/secret/14bc1958-b143-4ef7-ab60-c34975ecfb68-builder-dockercfg-bl95s-pull\") pod \"14bc1958-b143-4ef7-ab60-c34975ecfb68\" (UID: \"14bc1958-b143-4ef7-ab60-c34975ecfb68\") " Oct 03 00:34:49 crc kubenswrapper[4832]: I1003 00:34:49.993137 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/14bc1958-b143-4ef7-ab60-c34975ecfb68-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "14bc1958-b143-4ef7-ab60-c34975ecfb68" (UID: "14bc1958-b143-4ef7-ab60-c34975ecfb68"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 00:34:49 crc kubenswrapper[4832]: I1003 00:34:49.993458 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/14bc1958-b143-4ef7-ab60-c34975ecfb68-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "14bc1958-b143-4ef7-ab60-c34975ecfb68" (UID: "14bc1958-b143-4ef7-ab60-c34975ecfb68"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 00:34:49 crc kubenswrapper[4832]: I1003 00:34:49.993972 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/14bc1958-b143-4ef7-ab60-c34975ecfb68-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "14bc1958-b143-4ef7-ab60-c34975ecfb68" (UID: "14bc1958-b143-4ef7-ab60-c34975ecfb68"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 00:34:49 crc kubenswrapper[4832]: I1003 00:34:49.994064 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/14bc1958-b143-4ef7-ab60-c34975ecfb68-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "14bc1958-b143-4ef7-ab60-c34975ecfb68" (UID: "14bc1958-b143-4ef7-ab60-c34975ecfb68"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 00:34:49 crc kubenswrapper[4832]: I1003 00:34:49.994527 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/14bc1958-b143-4ef7-ab60-c34975ecfb68-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "14bc1958-b143-4ef7-ab60-c34975ecfb68" (UID: "14bc1958-b143-4ef7-ab60-c34975ecfb68"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 00:34:49 crc kubenswrapper[4832]: I1003 00:34:49.994739 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/14bc1958-b143-4ef7-ab60-c34975ecfb68-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "14bc1958-b143-4ef7-ab60-c34975ecfb68" (UID: "14bc1958-b143-4ef7-ab60-c34975ecfb68"). 
InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 00:34:49 crc kubenswrapper[4832]: I1003 00:34:49.994823 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/14bc1958-b143-4ef7-ab60-c34975ecfb68-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "14bc1958-b143-4ef7-ab60-c34975ecfb68" (UID: "14bc1958-b143-4ef7-ab60-c34975ecfb68"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 00:34:49 crc kubenswrapper[4832]: I1003 00:34:49.995428 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/14bc1958-b143-4ef7-ab60-c34975ecfb68-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "14bc1958-b143-4ef7-ab60-c34975ecfb68" (UID: "14bc1958-b143-4ef7-ab60-c34975ecfb68"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 00:34:49 crc kubenswrapper[4832]: I1003 00:34:49.997744 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14bc1958-b143-4ef7-ab60-c34975ecfb68-builder-dockercfg-bl95s-push" (OuterVolumeSpecName: "builder-dockercfg-bl95s-push") pod "14bc1958-b143-4ef7-ab60-c34975ecfb68" (UID: "14bc1958-b143-4ef7-ab60-c34975ecfb68"). InnerVolumeSpecName "builder-dockercfg-bl95s-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 00:34:49 crc kubenswrapper[4832]: I1003 00:34:49.998591 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14bc1958-b143-4ef7-ab60-c34975ecfb68-kube-api-access-gcbg6" (OuterVolumeSpecName: "kube-api-access-gcbg6") pod "14bc1958-b143-4ef7-ab60-c34975ecfb68" (UID: "14bc1958-b143-4ef7-ab60-c34975ecfb68"). InnerVolumeSpecName "kube-api-access-gcbg6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 00:34:50 crc kubenswrapper[4832]: I1003 00:34:50.000578 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/14bc1958-b143-4ef7-ab60-c34975ecfb68-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "14bc1958-b143-4ef7-ab60-c34975ecfb68" (UID: "14bc1958-b143-4ef7-ab60-c34975ecfb68"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 00:34:50 crc kubenswrapper[4832]: I1003 00:34:50.003253 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14bc1958-b143-4ef7-ab60-c34975ecfb68-builder-dockercfg-bl95s-pull" (OuterVolumeSpecName: "builder-dockercfg-bl95s-pull") pod "14bc1958-b143-4ef7-ab60-c34975ecfb68" (UID: "14bc1958-b143-4ef7-ab60-c34975ecfb68"). InnerVolumeSpecName "builder-dockercfg-bl95s-pull". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 00:34:50 crc kubenswrapper[4832]: I1003 00:34:50.094456 4832 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-bl95s-pull\" (UniqueName: \"kubernetes.io/secret/14bc1958-b143-4ef7-ab60-c34975ecfb68-builder-dockercfg-bl95s-pull\") on node \"crc\" DevicePath \"\"" Oct 03 00:34:50 crc kubenswrapper[4832]: I1003 00:34:50.094488 4832 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/14bc1958-b143-4ef7-ab60-c34975ecfb68-build-system-configs\") on node \"crc\" DevicePath \"\"" Oct 03 00:34:50 crc kubenswrapper[4832]: I1003 00:34:50.094497 4832 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-bl95s-push\" (UniqueName: \"kubernetes.io/secret/14bc1958-b143-4ef7-ab60-c34975ecfb68-builder-dockercfg-bl95s-push\") on node \"crc\" DevicePath \"\"" Oct 03 00:34:50 crc kubenswrapper[4832]: I1003 00:34:50.094506 4832 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/14bc1958-b143-4ef7-ab60-c34975ecfb68-build-blob-cache\") on node \"crc\" DevicePath \"\"" Oct 03 00:34:50 crc kubenswrapper[4832]: I1003 00:34:50.094514 4832 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gcbg6\" (UniqueName: \"kubernetes.io/projected/14bc1958-b143-4ef7-ab60-c34975ecfb68-kube-api-access-gcbg6\") on node \"crc\" DevicePath \"\"" Oct 03 00:34:50 crc kubenswrapper[4832]: I1003 00:34:50.094524 4832 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/14bc1958-b143-4ef7-ab60-c34975ecfb68-buildcachedir\") on node \"crc\" DevicePath \"\"" Oct 03 00:34:50 crc kubenswrapper[4832]: I1003 00:34:50.094533 4832 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/14bc1958-b143-4ef7-ab60-c34975ecfb68-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 03 00:34:50 crc kubenswrapper[4832]: I1003 00:34:50.094542 4832 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/14bc1958-b143-4ef7-ab60-c34975ecfb68-container-storage-run\") on node \"crc\" DevicePath \"\"" Oct 03 00:34:50 crc kubenswrapper[4832]: I1003 00:34:50.094550 4832 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/14bc1958-b143-4ef7-ab60-c34975ecfb68-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Oct 03 00:34:50 crc kubenswrapper[4832]: I1003 00:34:50.094558 4832 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/14bc1958-b143-4ef7-ab60-c34975ecfb68-container-storage-root\") on node \"crc\" DevicePath \"\"" Oct 03 00:34:50 crc kubenswrapper[4832]: I1003 00:34:50.094566 4832 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/14bc1958-b143-4ef7-ab60-c34975ecfb68-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 03 00:34:50 crc kubenswrapper[4832]: I1003 00:34:50.094576 4832 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/14bc1958-b143-4ef7-ab60-c34975ecfb68-buildworkdir\") on node \"crc\" DevicePath \"\"" Oct 03 00:34:50 crc kubenswrapper[4832]: I1003 00:34:50.751341 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="service-telemetry/smart-gateway-operator-bundle-2-build" event={"ID":"14bc1958-b143-4ef7-ab60-c34975ecfb68","Type":"ContainerDied","Data":"7b0de1a934c720cc672d8530661447b91086d742f1a6444065385139482407fb"} Oct 03 00:34:50 crc kubenswrapper[4832]: I1003 00:34:50.751417 4832 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7b0de1a934c720cc672d8530661447b91086d742f1a6444065385139482407fb" Oct 03 00:34:50 crc kubenswrapper[4832]: I1003 00:34:50.751433 4832 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-bundle-2-build" Oct 03 00:34:55 crc kubenswrapper[4832]: I1003 00:34:55.386686 4832 scope.go:117] "RemoveContainer" containerID="e44b7de9ac0dc4f733d645c5d56e63447d6c890efc8c243af73d5c7937b609ee" Oct 03 00:34:55 crc kubenswrapper[4832]: E1003 00:34:55.387186 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xzpk4_openshift-machine-config-operator(13eeb165-19f7-4e3f-a442-1c9c570d853b)\"" pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" podUID="13eeb165-19f7-4e3f-a442-1c9c570d853b" Oct 03 00:35:06 crc kubenswrapper[4832]: I1003 00:35:06.386911 4832 scope.go:117] "RemoveContainer" containerID="e44b7de9ac0dc4f733d645c5d56e63447d6c890efc8c243af73d5c7937b609ee" Oct 03 00:35:06 crc kubenswrapper[4832]: E1003 00:35:06.387678 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xzpk4_openshift-machine-config-operator(13eeb165-19f7-4e3f-a442-1c9c570d853b)\"" pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" podUID="13eeb165-19f7-4e3f-a442-1c9c570d853b" Oct 03 00:35:06 crc kubenswrapper[4832]: I1003 00:35:06.649288 4832 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/service-telemetry-framework-index-1-build"] Oct 03 00:35:06 crc kubenswrapper[4832]: E1003 00:35:06.649651 4832 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14bc1958-b143-4ef7-ab60-c34975ecfb68" containerName="docker-build" Oct 03 00:35:06 crc kubenswrapper[4832]: I1003 00:35:06.649685 4832 state_mem.go:107] "Deleted CPUSet assignment" podUID="14bc1958-b143-4ef7-ab60-c34975ecfb68" containerName="docker-build" Oct 03 00:35:06 crc kubenswrapper[4832]: E1003 00:35:06.649730 4832 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14bc1958-b143-4ef7-ab60-c34975ecfb68" containerName="git-clone" Oct 03 00:35:06 crc kubenswrapper[4832]: I1003 00:35:06.649737 4832 state_mem.go:107] "Deleted CPUSet assignment" podUID="14bc1958-b143-4ef7-ab60-c34975ecfb68" containerName="git-clone" Oct 03 00:35:06 crc kubenswrapper[4832]: E1003 00:35:06.649784 4832 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14bc1958-b143-4ef7-ab60-c34975ecfb68" containerName="manage-dockerfile" Oct 03 00:35:06 crc kubenswrapper[4832]: I1003 00:35:06.649793 4832 state_mem.go:107] "Deleted CPUSet assignment" podUID="14bc1958-b143-4ef7-ab60-c34975ecfb68" containerName="manage-dockerfile" Oct 03 00:35:06 crc kubenswrapper[4832]: I1003 00:35:06.649937 4832 memory_manager.go:354] "RemoveStaleState removing state" podUID="14bc1958-b143-4ef7-ab60-c34975ecfb68" containerName="docker-build" Oct 03 00:35:06 
crc kubenswrapper[4832]: I1003 00:35:06.650659 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-framework-index-1-build" Oct 03 00:35:06 crc kubenswrapper[4832]: I1003 00:35:06.653681 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-framework-index-1-sys-config" Oct 03 00:35:06 crc kubenswrapper[4832]: I1003 00:35:06.656247 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-framework-index-1-global-ca" Oct 03 00:35:06 crc kubenswrapper[4832]: I1003 00:35:06.656424 4832 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"service-telemetry-framework-index-dockercfg" Oct 03 00:35:06 crc kubenswrapper[4832]: I1003 00:35:06.657213 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-framework-index-1-ca" Oct 03 00:35:06 crc kubenswrapper[4832]: I1003 00:35:06.663502 4832 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-bl95s" Oct 03 00:35:06 crc kubenswrapper[4832]: I1003 00:35:06.691801 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-framework-index-1-build"] Oct 03 00:35:06 crc kubenswrapper[4832]: I1003 00:35:06.730436 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/bde9b83f-508b-4abf-8f27-1cb2cb86caed-node-pullsecrets\") pod \"service-telemetry-framework-index-1-build\" (UID: \"bde9b83f-508b-4abf-8f27-1cb2cb86caed\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Oct 03 00:35:06 crc kubenswrapper[4832]: I1003 00:35:06.730524 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/bde9b83f-508b-4abf-8f27-1cb2cb86caed-container-storage-root\") pod \"service-telemetry-framework-index-1-build\" (UID: \"bde9b83f-508b-4abf-8f27-1cb2cb86caed\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Oct 03 00:35:06 crc kubenswrapper[4832]: I1003 00:35:06.730577 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5lw7\" (UniqueName: \"kubernetes.io/projected/bde9b83f-508b-4abf-8f27-1cb2cb86caed-kube-api-access-t5lw7\") pod \"service-telemetry-framework-index-1-build\" (UID: \"bde9b83f-508b-4abf-8f27-1cb2cb86caed\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Oct 03 00:35:06 crc kubenswrapper[4832]: I1003 00:35:06.730738 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/bde9b83f-508b-4abf-8f27-1cb2cb86caed-build-blob-cache\") pod \"service-telemetry-framework-index-1-build\" (UID: \"bde9b83f-508b-4abf-8f27-1cb2cb86caed\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Oct 03 00:35:06 crc kubenswrapper[4832]: I1003 00:35:06.730843 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/bde9b83f-508b-4abf-8f27-1cb2cb86caed-buildworkdir\") pod \"service-telemetry-framework-index-1-build\" (UID: \"bde9b83f-508b-4abf-8f27-1cb2cb86caed\") " 
pod="service-telemetry/service-telemetry-framework-index-1-build" Oct 03 00:35:06 crc kubenswrapper[4832]: I1003 00:35:06.730900 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/bde9b83f-508b-4abf-8f27-1cb2cb86caed-buildcachedir\") pod \"service-telemetry-framework-index-1-build\" (UID: \"bde9b83f-508b-4abf-8f27-1cb2cb86caed\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Oct 03 00:35:06 crc kubenswrapper[4832]: I1003 00:35:06.730966 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/bde9b83f-508b-4abf-8f27-1cb2cb86caed-container-storage-run\") pod \"service-telemetry-framework-index-1-build\" (UID: \"bde9b83f-508b-4abf-8f27-1cb2cb86caed\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Oct 03 00:35:06 crc kubenswrapper[4832]: I1003 00:35:06.731036 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bde9b83f-508b-4abf-8f27-1cb2cb86caed-build-ca-bundles\") pod \"service-telemetry-framework-index-1-build\" (UID: \"bde9b83f-508b-4abf-8f27-1cb2cb86caed\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Oct 03 00:35:06 crc kubenswrapper[4832]: I1003 00:35:06.731093 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bde9b83f-508b-4abf-8f27-1cb2cb86caed-build-proxy-ca-bundles\") pod \"service-telemetry-framework-index-1-build\" (UID: \"bde9b83f-508b-4abf-8f27-1cb2cb86caed\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Oct 03 00:35:06 crc kubenswrapper[4832]: I1003 00:35:06.731173 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-bl95s-pull\" (UniqueName: \"kubernetes.io/secret/bde9b83f-508b-4abf-8f27-1cb2cb86caed-builder-dockercfg-bl95s-pull\") pod \"service-telemetry-framework-index-1-build\" (UID: \"bde9b83f-508b-4abf-8f27-1cb2cb86caed\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Oct 03 00:35:06 crc kubenswrapper[4832]: I1003 00:35:06.731242 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/bde9b83f-508b-4abf-8f27-1cb2cb86caed-build-system-configs\") pod \"service-telemetry-framework-index-1-build\" (UID: \"bde9b83f-508b-4abf-8f27-1cb2cb86caed\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Oct 03 00:35:06 crc kubenswrapper[4832]: I1003 00:35:06.731303 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-bl95s-push\" (UniqueName: \"kubernetes.io/secret/bde9b83f-508b-4abf-8f27-1cb2cb86caed-builder-dockercfg-bl95s-push\") pod \"service-telemetry-framework-index-1-build\" (UID: \"bde9b83f-508b-4abf-8f27-1cb2cb86caed\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Oct 03 00:35:06 crc kubenswrapper[4832]: I1003 00:35:06.731376 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-telemetry-framework-index-dockercfg-user-build-volume\" (UniqueName: 
\"kubernetes.io/secret/bde9b83f-508b-4abf-8f27-1cb2cb86caed-service-telemetry-framework-index-dockercfg-user-build-volume\") pod \"service-telemetry-framework-index-1-build\" (UID: \"bde9b83f-508b-4abf-8f27-1cb2cb86caed\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Oct 03 00:35:06 crc kubenswrapper[4832]: I1003 00:35:06.833889 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/bde9b83f-508b-4abf-8f27-1cb2cb86caed-container-storage-run\") pod \"service-telemetry-framework-index-1-build\" (UID: \"bde9b83f-508b-4abf-8f27-1cb2cb86caed\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Oct 03 00:35:06 crc kubenswrapper[4832]: I1003 00:35:06.834446 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/bde9b83f-508b-4abf-8f27-1cb2cb86caed-container-storage-run\") pod \"service-telemetry-framework-index-1-build\" (UID: \"bde9b83f-508b-4abf-8f27-1cb2cb86caed\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Oct 03 00:35:06 crc kubenswrapper[4832]: I1003 00:35:06.834662 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bde9b83f-508b-4abf-8f27-1cb2cb86caed-build-ca-bundles\") pod \"service-telemetry-framework-index-1-build\" (UID: \"bde9b83f-508b-4abf-8f27-1cb2cb86caed\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Oct 03 00:35:06 crc kubenswrapper[4832]: I1003 00:35:06.834939 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bde9b83f-508b-4abf-8f27-1cb2cb86caed-build-proxy-ca-bundles\") pod \"service-telemetry-framework-index-1-build\" (UID: \"bde9b83f-508b-4abf-8f27-1cb2cb86caed\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Oct 03 00:35:06 crc kubenswrapper[4832]: I1003 00:35:06.835161 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-bl95s-pull\" (UniqueName: \"kubernetes.io/secret/bde9b83f-508b-4abf-8f27-1cb2cb86caed-builder-dockercfg-bl95s-pull\") pod \"service-telemetry-framework-index-1-build\" (UID: \"bde9b83f-508b-4abf-8f27-1cb2cb86caed\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Oct 03 00:35:06 crc kubenswrapper[4832]: I1003 00:35:06.835410 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/bde9b83f-508b-4abf-8f27-1cb2cb86caed-build-system-configs\") pod \"service-telemetry-framework-index-1-build\" (UID: \"bde9b83f-508b-4abf-8f27-1cb2cb86caed\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Oct 03 00:35:06 crc kubenswrapper[4832]: I1003 00:35:06.835482 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bde9b83f-508b-4abf-8f27-1cb2cb86caed-build-ca-bundles\") pod \"service-telemetry-framework-index-1-build\" (UID: \"bde9b83f-508b-4abf-8f27-1cb2cb86caed\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Oct 03 00:35:06 crc kubenswrapper[4832]: I1003 00:35:06.835865 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: 
\"kubernetes.io/configmap/bde9b83f-508b-4abf-8f27-1cb2cb86caed-build-system-configs\") pod \"service-telemetry-framework-index-1-build\" (UID: \"bde9b83f-508b-4abf-8f27-1cb2cb86caed\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Oct 03 00:35:06 crc kubenswrapper[4832]: I1003 00:35:06.836230 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bde9b83f-508b-4abf-8f27-1cb2cb86caed-build-proxy-ca-bundles\") pod \"service-telemetry-framework-index-1-build\" (UID: \"bde9b83f-508b-4abf-8f27-1cb2cb86caed\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Oct 03 00:35:06 crc kubenswrapper[4832]: I1003 00:35:06.835640 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-bl95s-push\" (UniqueName: \"kubernetes.io/secret/bde9b83f-508b-4abf-8f27-1cb2cb86caed-builder-dockercfg-bl95s-push\") pod \"service-telemetry-framework-index-1-build\" (UID: \"bde9b83f-508b-4abf-8f27-1cb2cb86caed\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Oct 03 00:35:06 crc kubenswrapper[4832]: I1003 00:35:06.836914 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-telemetry-framework-index-dockercfg-user-build-volume\" (UniqueName: \"kubernetes.io/secret/bde9b83f-508b-4abf-8f27-1cb2cb86caed-service-telemetry-framework-index-dockercfg-user-build-volume\") pod \"service-telemetry-framework-index-1-build\" (UID: \"bde9b83f-508b-4abf-8f27-1cb2cb86caed\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Oct 03 00:35:06 crc kubenswrapper[4832]: I1003 00:35:06.837070 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/bde9b83f-508b-4abf-8f27-1cb2cb86caed-node-pullsecrets\") pod \"service-telemetry-framework-index-1-build\" (UID: \"bde9b83f-508b-4abf-8f27-1cb2cb86caed\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Oct 03 00:35:06 crc kubenswrapper[4832]: I1003 00:35:06.837102 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/bde9b83f-508b-4abf-8f27-1cb2cb86caed-container-storage-root\") pod \"service-telemetry-framework-index-1-build\" (UID: \"bde9b83f-508b-4abf-8f27-1cb2cb86caed\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Oct 03 00:35:06 crc kubenswrapper[4832]: I1003 00:35:06.837137 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t5lw7\" (UniqueName: \"kubernetes.io/projected/bde9b83f-508b-4abf-8f27-1cb2cb86caed-kube-api-access-t5lw7\") pod \"service-telemetry-framework-index-1-build\" (UID: \"bde9b83f-508b-4abf-8f27-1cb2cb86caed\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Oct 03 00:35:06 crc kubenswrapper[4832]: I1003 00:35:06.837207 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/bde9b83f-508b-4abf-8f27-1cb2cb86caed-node-pullsecrets\") pod \"service-telemetry-framework-index-1-build\" (UID: \"bde9b83f-508b-4abf-8f27-1cb2cb86caed\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Oct 03 00:35:06 crc kubenswrapper[4832]: I1003 00:35:06.837255 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/bde9b83f-508b-4abf-8f27-1cb2cb86caed-build-blob-cache\") pod \"service-telemetry-framework-index-1-build\" (UID: \"bde9b83f-508b-4abf-8f27-1cb2cb86caed\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Oct 03 00:35:06 crc kubenswrapper[4832]: I1003 00:35:06.837296 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/bde9b83f-508b-4abf-8f27-1cb2cb86caed-buildworkdir\") pod \"service-telemetry-framework-index-1-build\" (UID: \"bde9b83f-508b-4abf-8f27-1cb2cb86caed\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Oct 03 00:35:06 crc kubenswrapper[4832]: I1003 00:35:06.837340 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/bde9b83f-508b-4abf-8f27-1cb2cb86caed-buildcachedir\") pod \"service-telemetry-framework-index-1-build\" (UID: \"bde9b83f-508b-4abf-8f27-1cb2cb86caed\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Oct 03 00:35:06 crc kubenswrapper[4832]: I1003 00:35:06.837431 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/bde9b83f-508b-4abf-8f27-1cb2cb86caed-container-storage-root\") pod \"service-telemetry-framework-index-1-build\" (UID: \"bde9b83f-508b-4abf-8f27-1cb2cb86caed\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Oct 03 00:35:06 crc kubenswrapper[4832]: I1003 00:35:06.837441 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/bde9b83f-508b-4abf-8f27-1cb2cb86caed-buildcachedir\") pod \"service-telemetry-framework-index-1-build\" (UID: \"bde9b83f-508b-4abf-8f27-1cb2cb86caed\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Oct 03 00:35:06 crc kubenswrapper[4832]: I1003 00:35:06.837671 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/bde9b83f-508b-4abf-8f27-1cb2cb86caed-build-blob-cache\") pod \"service-telemetry-framework-index-1-build\" (UID: \"bde9b83f-508b-4abf-8f27-1cb2cb86caed\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Oct 03 00:35:06 crc kubenswrapper[4832]: I1003 00:35:06.837714 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/bde9b83f-508b-4abf-8f27-1cb2cb86caed-buildworkdir\") pod \"service-telemetry-framework-index-1-build\" (UID: \"bde9b83f-508b-4abf-8f27-1cb2cb86caed\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Oct 03 00:35:06 crc kubenswrapper[4832]: I1003 00:35:06.845067 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-telemetry-framework-index-dockercfg-user-build-volume\" (UniqueName: \"kubernetes.io/secret/bde9b83f-508b-4abf-8f27-1cb2cb86caed-service-telemetry-framework-index-dockercfg-user-build-volume\") pod \"service-telemetry-framework-index-1-build\" (UID: \"bde9b83f-508b-4abf-8f27-1cb2cb86caed\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Oct 03 00:35:06 crc kubenswrapper[4832]: I1003 00:35:06.845204 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-bl95s-push\" (UniqueName: \"kubernetes.io/secret/bde9b83f-508b-4abf-8f27-1cb2cb86caed-builder-dockercfg-bl95s-push\") pod 
\"service-telemetry-framework-index-1-build\" (UID: \"bde9b83f-508b-4abf-8f27-1cb2cb86caed\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Oct 03 00:35:06 crc kubenswrapper[4832]: I1003 00:35:06.846937 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-bl95s-pull\" (UniqueName: \"kubernetes.io/secret/bde9b83f-508b-4abf-8f27-1cb2cb86caed-builder-dockercfg-bl95s-pull\") pod \"service-telemetry-framework-index-1-build\" (UID: \"bde9b83f-508b-4abf-8f27-1cb2cb86caed\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Oct 03 00:35:06 crc kubenswrapper[4832]: I1003 00:35:06.857405 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5lw7\" (UniqueName: \"kubernetes.io/projected/bde9b83f-508b-4abf-8f27-1cb2cb86caed-kube-api-access-t5lw7\") pod \"service-telemetry-framework-index-1-build\" (UID: \"bde9b83f-508b-4abf-8f27-1cb2cb86caed\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Oct 03 00:35:06 crc kubenswrapper[4832]: I1003 00:35:06.992300 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-framework-index-1-build" Oct 03 00:35:07 crc kubenswrapper[4832]: I1003 00:35:07.274033 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-framework-index-1-build"] Oct 03 00:35:07 crc kubenswrapper[4832]: I1003 00:35:07.876191 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-index-1-build" event={"ID":"bde9b83f-508b-4abf-8f27-1cb2cb86caed","Type":"ContainerStarted","Data":"acf548956ac6be9760168aebea3b5b0655b6de9b0dc3c22de5fb9a04abf75ad3"} Oct 03 00:35:07 crc kubenswrapper[4832]: I1003 00:35:07.876676 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-index-1-build" event={"ID":"bde9b83f-508b-4abf-8f27-1cb2cb86caed","Type":"ContainerStarted","Data":"6af5aac787d8827a010873f0cd8a35393d2a96570f5071fb1d937915eec067d3"} Oct 03 00:35:08 crc kubenswrapper[4832]: I1003 00:35:08.882579 4832 generic.go:334] "Generic (PLEG): container finished" podID="bde9b83f-508b-4abf-8f27-1cb2cb86caed" containerID="acf548956ac6be9760168aebea3b5b0655b6de9b0dc3c22de5fb9a04abf75ad3" exitCode=0 Oct 03 00:35:08 crc kubenswrapper[4832]: I1003 00:35:08.882631 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-index-1-build" event={"ID":"bde9b83f-508b-4abf-8f27-1cb2cb86caed","Type":"ContainerDied","Data":"acf548956ac6be9760168aebea3b5b0655b6de9b0dc3c22de5fb9a04abf75ad3"} Oct 03 00:35:09 crc kubenswrapper[4832]: I1003 00:35:09.892272 4832 generic.go:334] "Generic (PLEG): container finished" podID="bde9b83f-508b-4abf-8f27-1cb2cb86caed" containerID="06c3d758a068879524456cd7af78a9071224143fd182d60a8737037a7549ef7d" exitCode=0 Oct 03 00:35:09 crc kubenswrapper[4832]: I1003 00:35:09.892320 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-index-1-build" event={"ID":"bde9b83f-508b-4abf-8f27-1cb2cb86caed","Type":"ContainerDied","Data":"06c3d758a068879524456cd7af78a9071224143fd182d60a8737037a7549ef7d"} Oct 03 00:35:09 crc kubenswrapper[4832]: I1003 00:35:09.935696 4832 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-framework-index-1-build_bde9b83f-508b-4abf-8f27-1cb2cb86caed/manage-dockerfile/0.log" Oct 03 00:35:10 crc 
kubenswrapper[4832]: I1003 00:35:10.900795 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-index-1-build" event={"ID":"bde9b83f-508b-4abf-8f27-1cb2cb86caed","Type":"ContainerStarted","Data":"602514ff10d28b0d89d2d7a8047db07ca890fc102d232bf50345767a48e9820c"} Oct 03 00:35:10 crc kubenswrapper[4832]: I1003 00:35:10.928367 4832 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/service-telemetry-framework-index-1-build" podStartSLOduration=4.928348203 podStartE2EDuration="4.928348203s" podCreationTimestamp="2025-10-03 00:35:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 00:35:10.925157081 +0000 UTC m=+1704.187312187" watchObservedRunningTime="2025-10-03 00:35:10.928348203 +0000 UTC m=+1704.190503309" Oct 03 00:35:19 crc kubenswrapper[4832]: I1003 00:35:19.386797 4832 scope.go:117] "RemoveContainer" containerID="e44b7de9ac0dc4f733d645c5d56e63447d6c890efc8c243af73d5c7937b609ee" Oct 03 00:35:19 crc kubenswrapper[4832]: E1003 00:35:19.387558 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xzpk4_openshift-machine-config-operator(13eeb165-19f7-4e3f-a442-1c9c570d853b)\"" pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" podUID="13eeb165-19f7-4e3f-a442-1c9c570d853b" Oct 03 00:35:32 crc kubenswrapper[4832]: I1003 00:35:32.386879 4832 scope.go:117] "RemoveContainer" containerID="e44b7de9ac0dc4f733d645c5d56e63447d6c890efc8c243af73d5c7937b609ee" Oct 03 00:35:32 crc kubenswrapper[4832]: E1003 00:35:32.387805 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xzpk4_openshift-machine-config-operator(13eeb165-19f7-4e3f-a442-1c9c570d853b)\"" pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" podUID="13eeb165-19f7-4e3f-a442-1c9c570d853b" Oct 03 00:35:42 crc kubenswrapper[4832]: I1003 00:35:42.100825 4832 generic.go:334] "Generic (PLEG): container finished" podID="bde9b83f-508b-4abf-8f27-1cb2cb86caed" containerID="602514ff10d28b0d89d2d7a8047db07ca890fc102d232bf50345767a48e9820c" exitCode=0 Oct 03 00:35:42 crc kubenswrapper[4832]: I1003 00:35:42.100894 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-index-1-build" event={"ID":"bde9b83f-508b-4abf-8f27-1cb2cb86caed","Type":"ContainerDied","Data":"602514ff10d28b0d89d2d7a8047db07ca890fc102d232bf50345767a48e9820c"} Oct 03 00:35:43 crc kubenswrapper[4832]: I1003 00:35:43.356129 4832 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-framework-index-1-build" Oct 03 00:35:43 crc kubenswrapper[4832]: I1003 00:35:43.433041 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-telemetry-framework-index-dockercfg-user-build-volume\" (UniqueName: \"kubernetes.io/secret/bde9b83f-508b-4abf-8f27-1cb2cb86caed-service-telemetry-framework-index-dockercfg-user-build-volume\") pod \"bde9b83f-508b-4abf-8f27-1cb2cb86caed\" (UID: \"bde9b83f-508b-4abf-8f27-1cb2cb86caed\") " Oct 03 00:35:43 crc kubenswrapper[4832]: I1003 00:35:43.433110 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/bde9b83f-508b-4abf-8f27-1cb2cb86caed-build-blob-cache\") pod \"bde9b83f-508b-4abf-8f27-1cb2cb86caed\" (UID: \"bde9b83f-508b-4abf-8f27-1cb2cb86caed\") " Oct 03 00:35:43 crc kubenswrapper[4832]: I1003 00:35:43.433139 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-bl95s-push\" (UniqueName: \"kubernetes.io/secret/bde9b83f-508b-4abf-8f27-1cb2cb86caed-builder-dockercfg-bl95s-push\") pod \"bde9b83f-508b-4abf-8f27-1cb2cb86caed\" (UID: \"bde9b83f-508b-4abf-8f27-1cb2cb86caed\") " Oct 03 00:35:43 crc kubenswrapper[4832]: I1003 00:35:43.433173 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/bde9b83f-508b-4abf-8f27-1cb2cb86caed-container-storage-run\") pod \"bde9b83f-508b-4abf-8f27-1cb2cb86caed\" (UID: \"bde9b83f-508b-4abf-8f27-1cb2cb86caed\") " Oct 03 00:35:43 crc kubenswrapper[4832]: I1003 00:35:43.433231 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t5lw7\" (UniqueName: \"kubernetes.io/projected/bde9b83f-508b-4abf-8f27-1cb2cb86caed-kube-api-access-t5lw7\") pod \"bde9b83f-508b-4abf-8f27-1cb2cb86caed\" (UID: \"bde9b83f-508b-4abf-8f27-1cb2cb86caed\") " Oct 03 00:35:43 crc kubenswrapper[4832]: I1003 00:35:43.433250 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/bde9b83f-508b-4abf-8f27-1cb2cb86caed-buildworkdir\") pod \"bde9b83f-508b-4abf-8f27-1cb2cb86caed\" (UID: \"bde9b83f-508b-4abf-8f27-1cb2cb86caed\") " Oct 03 00:35:43 crc kubenswrapper[4832]: I1003 00:35:43.433323 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/bde9b83f-508b-4abf-8f27-1cb2cb86caed-container-storage-root\") pod \"bde9b83f-508b-4abf-8f27-1cb2cb86caed\" (UID: \"bde9b83f-508b-4abf-8f27-1cb2cb86caed\") " Oct 03 00:35:43 crc kubenswrapper[4832]: I1003 00:35:43.433355 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/bde9b83f-508b-4abf-8f27-1cb2cb86caed-buildcachedir\") pod \"bde9b83f-508b-4abf-8f27-1cb2cb86caed\" (UID: \"bde9b83f-508b-4abf-8f27-1cb2cb86caed\") " Oct 03 00:35:43 crc kubenswrapper[4832]: I1003 00:35:43.433373 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bde9b83f-508b-4abf-8f27-1cb2cb86caed-build-ca-bundles\") pod \"bde9b83f-508b-4abf-8f27-1cb2cb86caed\" (UID: \"bde9b83f-508b-4abf-8f27-1cb2cb86caed\") " Oct 03 00:35:43 crc kubenswrapper[4832]: I1003 00:35:43.433399 4832 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bde9b83f-508b-4abf-8f27-1cb2cb86caed-build-proxy-ca-bundles\") pod \"bde9b83f-508b-4abf-8f27-1cb2cb86caed\" (UID: \"bde9b83f-508b-4abf-8f27-1cb2cb86caed\") " Oct 03 00:35:43 crc kubenswrapper[4832]: I1003 00:35:43.433440 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-bl95s-pull\" (UniqueName: \"kubernetes.io/secret/bde9b83f-508b-4abf-8f27-1cb2cb86caed-builder-dockercfg-bl95s-pull\") pod \"bde9b83f-508b-4abf-8f27-1cb2cb86caed\" (UID: \"bde9b83f-508b-4abf-8f27-1cb2cb86caed\") " Oct 03 00:35:43 crc kubenswrapper[4832]: I1003 00:35:43.433478 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/bde9b83f-508b-4abf-8f27-1cb2cb86caed-node-pullsecrets\") pod \"bde9b83f-508b-4abf-8f27-1cb2cb86caed\" (UID: \"bde9b83f-508b-4abf-8f27-1cb2cb86caed\") " Oct 03 00:35:43 crc kubenswrapper[4832]: I1003 00:35:43.433543 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/bde9b83f-508b-4abf-8f27-1cb2cb86caed-build-system-configs\") pod \"bde9b83f-508b-4abf-8f27-1cb2cb86caed\" (UID: \"bde9b83f-508b-4abf-8f27-1cb2cb86caed\") " Oct 03 00:35:43 crc kubenswrapper[4832]: I1003 00:35:43.433834 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bde9b83f-508b-4abf-8f27-1cb2cb86caed-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "bde9b83f-508b-4abf-8f27-1cb2cb86caed" (UID: "bde9b83f-508b-4abf-8f27-1cb2cb86caed"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 00:35:43 crc kubenswrapper[4832]: I1003 00:35:43.434330 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bde9b83f-508b-4abf-8f27-1cb2cb86caed-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "bde9b83f-508b-4abf-8f27-1cb2cb86caed" (UID: "bde9b83f-508b-4abf-8f27-1cb2cb86caed"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 00:35:43 crc kubenswrapper[4832]: I1003 00:35:43.434346 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bde9b83f-508b-4abf-8f27-1cb2cb86caed-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "bde9b83f-508b-4abf-8f27-1cb2cb86caed" (UID: "bde9b83f-508b-4abf-8f27-1cb2cb86caed"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 00:35:43 crc kubenswrapper[4832]: I1003 00:35:43.434539 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bde9b83f-508b-4abf-8f27-1cb2cb86caed-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "bde9b83f-508b-4abf-8f27-1cb2cb86caed" (UID: "bde9b83f-508b-4abf-8f27-1cb2cb86caed"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 00:35:43 crc kubenswrapper[4832]: I1003 00:35:43.434647 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bde9b83f-508b-4abf-8f27-1cb2cb86caed-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "bde9b83f-508b-4abf-8f27-1cb2cb86caed" (UID: "bde9b83f-508b-4abf-8f27-1cb2cb86caed"). 
InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 00:35:43 crc kubenswrapper[4832]: I1003 00:35:43.434775 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bde9b83f-508b-4abf-8f27-1cb2cb86caed-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "bde9b83f-508b-4abf-8f27-1cb2cb86caed" (UID: "bde9b83f-508b-4abf-8f27-1cb2cb86caed"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 00:35:43 crc kubenswrapper[4832]: I1003 00:35:43.434872 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bde9b83f-508b-4abf-8f27-1cb2cb86caed-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "bde9b83f-508b-4abf-8f27-1cb2cb86caed" (UID: "bde9b83f-508b-4abf-8f27-1cb2cb86caed"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 00:35:43 crc kubenswrapper[4832]: I1003 00:35:43.439703 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bde9b83f-508b-4abf-8f27-1cb2cb86caed-kube-api-access-t5lw7" (OuterVolumeSpecName: "kube-api-access-t5lw7") pod "bde9b83f-508b-4abf-8f27-1cb2cb86caed" (UID: "bde9b83f-508b-4abf-8f27-1cb2cb86caed"). InnerVolumeSpecName "kube-api-access-t5lw7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 00:35:43 crc kubenswrapper[4832]: I1003 00:35:43.439713 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bde9b83f-508b-4abf-8f27-1cb2cb86caed-service-telemetry-framework-index-dockercfg-user-build-volume" (OuterVolumeSpecName: "service-telemetry-framework-index-dockercfg-user-build-volume") pod "bde9b83f-508b-4abf-8f27-1cb2cb86caed" (UID: "bde9b83f-508b-4abf-8f27-1cb2cb86caed"). InnerVolumeSpecName "service-telemetry-framework-index-dockercfg-user-build-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 00:35:43 crc kubenswrapper[4832]: I1003 00:35:43.440510 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bde9b83f-508b-4abf-8f27-1cb2cb86caed-builder-dockercfg-bl95s-push" (OuterVolumeSpecName: "builder-dockercfg-bl95s-push") pod "bde9b83f-508b-4abf-8f27-1cb2cb86caed" (UID: "bde9b83f-508b-4abf-8f27-1cb2cb86caed"). InnerVolumeSpecName "builder-dockercfg-bl95s-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 00:35:43 crc kubenswrapper[4832]: I1003 00:35:43.440836 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bde9b83f-508b-4abf-8f27-1cb2cb86caed-builder-dockercfg-bl95s-pull" (OuterVolumeSpecName: "builder-dockercfg-bl95s-pull") pod "bde9b83f-508b-4abf-8f27-1cb2cb86caed" (UID: "bde9b83f-508b-4abf-8f27-1cb2cb86caed"). InnerVolumeSpecName "builder-dockercfg-bl95s-pull". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 00:35:43 crc kubenswrapper[4832]: I1003 00:35:43.535160 4832 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-bl95s-pull\" (UniqueName: \"kubernetes.io/secret/bde9b83f-508b-4abf-8f27-1cb2cb86caed-builder-dockercfg-bl95s-pull\") on node \"crc\" DevicePath \"\"" Oct 03 00:35:43 crc kubenswrapper[4832]: I1003 00:35:43.535193 4832 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/bde9b83f-508b-4abf-8f27-1cb2cb86caed-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Oct 03 00:35:43 crc kubenswrapper[4832]: I1003 00:35:43.535202 4832 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/bde9b83f-508b-4abf-8f27-1cb2cb86caed-build-system-configs\") on node \"crc\" DevicePath \"\"" Oct 03 00:35:43 crc kubenswrapper[4832]: I1003 00:35:43.535211 4832 reconciler_common.go:293] "Volume detached for volume \"service-telemetry-framework-index-dockercfg-user-build-volume\" (UniqueName: \"kubernetes.io/secret/bde9b83f-508b-4abf-8f27-1cb2cb86caed-service-telemetry-framework-index-dockercfg-user-build-volume\") on node \"crc\" DevicePath \"\"" Oct 03 00:35:43 crc kubenswrapper[4832]: I1003 00:35:43.535221 4832 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-bl95s-push\" (UniqueName: \"kubernetes.io/secret/bde9b83f-508b-4abf-8f27-1cb2cb86caed-builder-dockercfg-bl95s-push\") on node \"crc\" DevicePath \"\"" Oct 03 00:35:43 crc kubenswrapper[4832]: I1003 00:35:43.535230 4832 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/bde9b83f-508b-4abf-8f27-1cb2cb86caed-container-storage-run\") on node \"crc\" DevicePath \"\"" Oct 03 00:35:43 crc kubenswrapper[4832]: I1003 00:35:43.535238 4832 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t5lw7\" (UniqueName: \"kubernetes.io/projected/bde9b83f-508b-4abf-8f27-1cb2cb86caed-kube-api-access-t5lw7\") on node \"crc\" DevicePath \"\"" Oct 03 00:35:43 crc kubenswrapper[4832]: I1003 00:35:43.535246 4832 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/bde9b83f-508b-4abf-8f27-1cb2cb86caed-buildworkdir\") on node \"crc\" DevicePath \"\"" Oct 03 00:35:43 crc kubenswrapper[4832]: I1003 00:35:43.535255 4832 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/bde9b83f-508b-4abf-8f27-1cb2cb86caed-buildcachedir\") on node \"crc\" DevicePath \"\"" Oct 03 00:35:43 crc kubenswrapper[4832]: I1003 00:35:43.535263 4832 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bde9b83f-508b-4abf-8f27-1cb2cb86caed-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 03 00:35:43 crc kubenswrapper[4832]: I1003 00:35:43.535271 4832 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bde9b83f-508b-4abf-8f27-1cb2cb86caed-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 03 00:35:44 crc kubenswrapper[4832]: I1003 00:35:44.114494 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-index-1-build" event={"ID":"bde9b83f-508b-4abf-8f27-1cb2cb86caed","Type":"ContainerDied","Data":"6af5aac787d8827a010873f0cd8a35393d2a96570f5071fb1d937915eec067d3"} Oct 03 00:35:44 crc 
kubenswrapper[4832]: I1003 00:35:44.114817 4832 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6af5aac787d8827a010873f0cd8a35393d2a96570f5071fb1d937915eec067d3" Oct 03 00:35:44 crc kubenswrapper[4832]: I1003 00:35:44.114582 4832 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-framework-index-1-build" Oct 03 00:35:44 crc kubenswrapper[4832]: I1003 00:35:44.387024 4832 scope.go:117] "RemoveContainer" containerID="e44b7de9ac0dc4f733d645c5d56e63447d6c890efc8c243af73d5c7937b609ee" Oct 03 00:35:44 crc kubenswrapper[4832]: E1003 00:35:44.387405 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xzpk4_openshift-machine-config-operator(13eeb165-19f7-4e3f-a442-1c9c570d853b)\"" pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" podUID="13eeb165-19f7-4e3f-a442-1c9c570d853b" Oct 03 00:35:44 crc kubenswrapper[4832]: I1003 00:35:44.466421 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bde9b83f-508b-4abf-8f27-1cb2cb86caed-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "bde9b83f-508b-4abf-8f27-1cb2cb86caed" (UID: "bde9b83f-508b-4abf-8f27-1cb2cb86caed"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 00:35:44 crc kubenswrapper[4832]: I1003 00:35:44.549258 4832 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/bde9b83f-508b-4abf-8f27-1cb2cb86caed-build-blob-cache\") on node \"crc\" DevicePath \"\"" Oct 03 00:35:45 crc kubenswrapper[4832]: I1003 00:35:45.095541 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bde9b83f-508b-4abf-8f27-1cb2cb86caed-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "bde9b83f-508b-4abf-8f27-1cb2cb86caed" (UID: "bde9b83f-508b-4abf-8f27-1cb2cb86caed"). InnerVolumeSpecName "container-storage-root". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 00:35:45 crc kubenswrapper[4832]: I1003 00:35:45.155513 4832 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/bde9b83f-508b-4abf-8f27-1cb2cb86caed-container-storage-root\") on node \"crc\" DevicePath \"\"" Oct 03 00:35:45 crc kubenswrapper[4832]: I1003 00:35:45.428175 4832 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/infrawatch-operators-6tx4r"] Oct 03 00:35:45 crc kubenswrapper[4832]: E1003 00:35:45.429231 4832 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bde9b83f-508b-4abf-8f27-1cb2cb86caed" containerName="git-clone" Oct 03 00:35:45 crc kubenswrapper[4832]: I1003 00:35:45.429309 4832 state_mem.go:107] "Deleted CPUSet assignment" podUID="bde9b83f-508b-4abf-8f27-1cb2cb86caed" containerName="git-clone" Oct 03 00:35:45 crc kubenswrapper[4832]: E1003 00:35:45.429384 4832 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bde9b83f-508b-4abf-8f27-1cb2cb86caed" containerName="docker-build" Oct 03 00:35:45 crc kubenswrapper[4832]: I1003 00:35:45.429445 4832 state_mem.go:107] "Deleted CPUSet assignment" podUID="bde9b83f-508b-4abf-8f27-1cb2cb86caed" containerName="docker-build" Oct 03 00:35:45 crc kubenswrapper[4832]: E1003 00:35:45.429500 4832 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bde9b83f-508b-4abf-8f27-1cb2cb86caed" containerName="manage-dockerfile" Oct 03 00:35:45 crc kubenswrapper[4832]: I1003 00:35:45.429559 4832 state_mem.go:107] "Deleted CPUSet assignment" podUID="bde9b83f-508b-4abf-8f27-1cb2cb86caed" containerName="manage-dockerfile" Oct 03 00:35:45 crc kubenswrapper[4832]: I1003 00:35:45.429696 4832 memory_manager.go:354] "RemoveStaleState removing state" podUID="bde9b83f-508b-4abf-8f27-1cb2cb86caed" containerName="docker-build" Oct 03 00:35:45 crc kubenswrapper[4832]: I1003 00:35:45.430171 4832 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/infrawatch-operators-6tx4r" Oct 03 00:35:45 crc kubenswrapper[4832]: I1003 00:35:45.432538 4832 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"infrawatch-operators-dockercfg-7qf9g" Oct 03 00:35:45 crc kubenswrapper[4832]: I1003 00:35:45.442540 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/infrawatch-operators-6tx4r"] Oct 03 00:35:45 crc kubenswrapper[4832]: I1003 00:35:45.561140 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hx55\" (UniqueName: \"kubernetes.io/projected/66db0ed3-79c9-4814-b3ad-05fe6b392b5a-kube-api-access-6hx55\") pod \"infrawatch-operators-6tx4r\" (UID: \"66db0ed3-79c9-4814-b3ad-05fe6b392b5a\") " pod="service-telemetry/infrawatch-operators-6tx4r" Oct 03 00:35:45 crc kubenswrapper[4832]: I1003 00:35:45.662610 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6hx55\" (UniqueName: \"kubernetes.io/projected/66db0ed3-79c9-4814-b3ad-05fe6b392b5a-kube-api-access-6hx55\") pod \"infrawatch-operators-6tx4r\" (UID: \"66db0ed3-79c9-4814-b3ad-05fe6b392b5a\") " pod="service-telemetry/infrawatch-operators-6tx4r" Oct 03 00:35:45 crc kubenswrapper[4832]: I1003 00:35:45.686076 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hx55\" (UniqueName: \"kubernetes.io/projected/66db0ed3-79c9-4814-b3ad-05fe6b392b5a-kube-api-access-6hx55\") pod \"infrawatch-operators-6tx4r\" (UID: \"66db0ed3-79c9-4814-b3ad-05fe6b392b5a\") " pod="service-telemetry/infrawatch-operators-6tx4r" Oct 03 00:35:45 crc kubenswrapper[4832]: I1003 00:35:45.747837 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/infrawatch-operators-6tx4r" Oct 03 00:35:45 crc kubenswrapper[4832]: I1003 00:35:45.961833 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/infrawatch-operators-6tx4r"] Oct 03 00:35:46 crc kubenswrapper[4832]: I1003 00:35:46.130150 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/infrawatch-operators-6tx4r" event={"ID":"66db0ed3-79c9-4814-b3ad-05fe6b392b5a","Type":"ContainerStarted","Data":"db3d8928e51725670777c903b3c80c08e191f4c1431388d68f2c2d931f992ba4"} Oct 03 00:35:49 crc kubenswrapper[4832]: I1003 00:35:49.818390 4832 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/infrawatch-operators-6tx4r"] Oct 03 00:35:50 crc kubenswrapper[4832]: I1003 00:35:50.633175 4832 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/infrawatch-operators-jcsnv"] Oct 03 00:35:50 crc kubenswrapper[4832]: I1003 00:35:50.634224 4832 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/infrawatch-operators-jcsnv" Oct 03 00:35:50 crc kubenswrapper[4832]: I1003 00:35:50.643053 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/infrawatch-operators-jcsnv"] Oct 03 00:35:50 crc kubenswrapper[4832]: I1003 00:35:50.738185 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqlfm\" (UniqueName: \"kubernetes.io/projected/d6a63b6d-21ab-4ec9-8d5f-f8b7cf6036ab-kube-api-access-tqlfm\") pod \"infrawatch-operators-jcsnv\" (UID: \"d6a63b6d-21ab-4ec9-8d5f-f8b7cf6036ab\") " pod="service-telemetry/infrawatch-operators-jcsnv" Oct 03 00:35:50 crc kubenswrapper[4832]: I1003 00:35:50.839485 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tqlfm\" (UniqueName: \"kubernetes.io/projected/d6a63b6d-21ab-4ec9-8d5f-f8b7cf6036ab-kube-api-access-tqlfm\") pod \"infrawatch-operators-jcsnv\" (UID: \"d6a63b6d-21ab-4ec9-8d5f-f8b7cf6036ab\") " pod="service-telemetry/infrawatch-operators-jcsnv" Oct 03 00:35:50 crc kubenswrapper[4832]: I1003 00:35:50.877779 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqlfm\" (UniqueName: \"kubernetes.io/projected/d6a63b6d-21ab-4ec9-8d5f-f8b7cf6036ab-kube-api-access-tqlfm\") pod \"infrawatch-operators-jcsnv\" (UID: \"d6a63b6d-21ab-4ec9-8d5f-f8b7cf6036ab\") " pod="service-telemetry/infrawatch-operators-jcsnv" Oct 03 00:35:50 crc kubenswrapper[4832]: I1003 00:35:50.955512 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/infrawatch-operators-jcsnv" Oct 03 00:35:55 crc kubenswrapper[4832]: I1003 00:35:55.387417 4832 scope.go:117] "RemoveContainer" containerID="e44b7de9ac0dc4f733d645c5d56e63447d6c890efc8c243af73d5c7937b609ee" Oct 03 00:35:55 crc kubenswrapper[4832]: E1003 00:35:55.388176 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xzpk4_openshift-machine-config-operator(13eeb165-19f7-4e3f-a442-1c9c570d853b)\"" pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" podUID="13eeb165-19f7-4e3f-a442-1c9c570d853b" Oct 03 00:35:55 crc kubenswrapper[4832]: I1003 00:35:55.774761 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/infrawatch-operators-jcsnv"] Oct 03 00:35:56 crc kubenswrapper[4832]: I1003 00:35:56.197792 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/infrawatch-operators-jcsnv" event={"ID":"d6a63b6d-21ab-4ec9-8d5f-f8b7cf6036ab","Type":"ContainerStarted","Data":"7d70836f51e60ae9ad0bb57b1548530e1b2345e9be9ab6e1d761182de68eb2db"} Oct 03 00:35:56 crc kubenswrapper[4832]: I1003 00:35:56.197827 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/infrawatch-operators-jcsnv" event={"ID":"d6a63b6d-21ab-4ec9-8d5f-f8b7cf6036ab","Type":"ContainerStarted","Data":"4d3f068741e73aa83b72ab843093c4e47c5a8522a961772d583ebd70e7c00a45"} Oct 03 00:35:56 crc kubenswrapper[4832]: I1003 00:35:56.202922 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/infrawatch-operators-6tx4r" event={"ID":"66db0ed3-79c9-4814-b3ad-05fe6b392b5a","Type":"ContainerStarted","Data":"5605fbd72b81f9f9cc03ceea3189d75349e6577d6a8e3b720e372bba94bc22b0"} Oct 03 00:35:56 crc kubenswrapper[4832]: I1003 00:35:56.203040 4832 
kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/infrawatch-operators-6tx4r" podUID="66db0ed3-79c9-4814-b3ad-05fe6b392b5a" containerName="registry-server" containerID="cri-o://5605fbd72b81f9f9cc03ceea3189d75349e6577d6a8e3b720e372bba94bc22b0" gracePeriod=2 Oct 03 00:35:56 crc kubenswrapper[4832]: I1003 00:35:56.234847 4832 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/infrawatch-operators-jcsnv" podStartSLOduration=6.162683073 podStartE2EDuration="6.234825269s" podCreationTimestamp="2025-10-03 00:35:50 +0000 UTC" firstStartedPulling="2025-10-03 00:35:55.888331878 +0000 UTC m=+1749.150486984" lastFinishedPulling="2025-10-03 00:35:55.960474064 +0000 UTC m=+1749.222629180" observedRunningTime="2025-10-03 00:35:56.218355985 +0000 UTC m=+1749.480511081" watchObservedRunningTime="2025-10-03 00:35:56.234825269 +0000 UTC m=+1749.496980375" Oct 03 00:35:56 crc kubenswrapper[4832]: I1003 00:35:56.236835 4832 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/infrawatch-operators-6tx4r" podStartSLOduration=1.232700401 podStartE2EDuration="11.236820322s" podCreationTimestamp="2025-10-03 00:35:45 +0000 UTC" firstStartedPulling="2025-10-03 00:35:45.966975221 +0000 UTC m=+1739.229130327" lastFinishedPulling="2025-10-03 00:35:55.971095142 +0000 UTC m=+1749.233250248" observedRunningTime="2025-10-03 00:35:56.233333641 +0000 UTC m=+1749.495488777" watchObservedRunningTime="2025-10-03 00:35:56.236820322 +0000 UTC m=+1749.498975438" Oct 03 00:35:56 crc kubenswrapper[4832]: I1003 00:35:56.525013 4832 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/infrawatch-operators-6tx4r" Oct 03 00:35:56 crc kubenswrapper[4832]: I1003 00:35:56.611503 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6hx55\" (UniqueName: \"kubernetes.io/projected/66db0ed3-79c9-4814-b3ad-05fe6b392b5a-kube-api-access-6hx55\") pod \"66db0ed3-79c9-4814-b3ad-05fe6b392b5a\" (UID: \"66db0ed3-79c9-4814-b3ad-05fe6b392b5a\") " Oct 03 00:35:56 crc kubenswrapper[4832]: I1003 00:35:56.618786 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66db0ed3-79c9-4814-b3ad-05fe6b392b5a-kube-api-access-6hx55" (OuterVolumeSpecName: "kube-api-access-6hx55") pod "66db0ed3-79c9-4814-b3ad-05fe6b392b5a" (UID: "66db0ed3-79c9-4814-b3ad-05fe6b392b5a"). InnerVolumeSpecName "kube-api-access-6hx55". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 00:35:56 crc kubenswrapper[4832]: I1003 00:35:56.713915 4832 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6hx55\" (UniqueName: \"kubernetes.io/projected/66db0ed3-79c9-4814-b3ad-05fe6b392b5a-kube-api-access-6hx55\") on node \"crc\" DevicePath \"\"" Oct 03 00:35:57 crc kubenswrapper[4832]: I1003 00:35:57.210444 4832 generic.go:334] "Generic (PLEG): container finished" podID="66db0ed3-79c9-4814-b3ad-05fe6b392b5a" containerID="5605fbd72b81f9f9cc03ceea3189d75349e6577d6a8e3b720e372bba94bc22b0" exitCode=0 Oct 03 00:35:57 crc kubenswrapper[4832]: I1003 00:35:57.210502 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/infrawatch-operators-6tx4r" event={"ID":"66db0ed3-79c9-4814-b3ad-05fe6b392b5a","Type":"ContainerDied","Data":"5605fbd72b81f9f9cc03ceea3189d75349e6577d6a8e3b720e372bba94bc22b0"} Oct 03 00:35:57 crc kubenswrapper[4832]: I1003 00:35:57.210574 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/infrawatch-operators-6tx4r" event={"ID":"66db0ed3-79c9-4814-b3ad-05fe6b392b5a","Type":"ContainerDied","Data":"db3d8928e51725670777c903b3c80c08e191f4c1431388d68f2c2d931f992ba4"} Oct 03 00:35:57 crc kubenswrapper[4832]: I1003 00:35:57.210607 4832 scope.go:117] "RemoveContainer" containerID="5605fbd72b81f9f9cc03ceea3189d75349e6577d6a8e3b720e372bba94bc22b0" Oct 03 00:35:57 crc kubenswrapper[4832]: I1003 00:35:57.210531 4832 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/infrawatch-operators-6tx4r" Oct 03 00:35:57 crc kubenswrapper[4832]: I1003 00:35:57.231038 4832 scope.go:117] "RemoveContainer" containerID="5605fbd72b81f9f9cc03ceea3189d75349e6577d6a8e3b720e372bba94bc22b0" Oct 03 00:35:57 crc kubenswrapper[4832]: E1003 00:35:57.232058 4832 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5605fbd72b81f9f9cc03ceea3189d75349e6577d6a8e3b720e372bba94bc22b0\": container with ID starting with 5605fbd72b81f9f9cc03ceea3189d75349e6577d6a8e3b720e372bba94bc22b0 not found: ID does not exist" containerID="5605fbd72b81f9f9cc03ceea3189d75349e6577d6a8e3b720e372bba94bc22b0" Oct 03 00:35:57 crc kubenswrapper[4832]: I1003 00:35:57.232102 4832 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5605fbd72b81f9f9cc03ceea3189d75349e6577d6a8e3b720e372bba94bc22b0"} err="failed to get container status \"5605fbd72b81f9f9cc03ceea3189d75349e6577d6a8e3b720e372bba94bc22b0\": rpc error: code = NotFound desc = could not find container \"5605fbd72b81f9f9cc03ceea3189d75349e6577d6a8e3b720e372bba94bc22b0\": container with ID starting with 5605fbd72b81f9f9cc03ceea3189d75349e6577d6a8e3b720e372bba94bc22b0 not found: ID does not exist" Oct 03 00:35:57 crc kubenswrapper[4832]: I1003 00:35:57.242000 4832 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/infrawatch-operators-6tx4r"] Oct 03 00:35:57 crc kubenswrapper[4832]: I1003 00:35:57.246055 4832 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/infrawatch-operators-6tx4r"] Oct 03 00:35:57 crc kubenswrapper[4832]: I1003 00:35:57.402064 4832 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66db0ed3-79c9-4814-b3ad-05fe6b392b5a" path="/var/lib/kubelet/pods/66db0ed3-79c9-4814-b3ad-05fe6b392b5a/volumes" Oct 03 00:36:00 crc kubenswrapper[4832]: I1003 00:36:00.956291 4832 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="service-telemetry/infrawatch-operators-jcsnv" Oct 03 00:36:00 crc kubenswrapper[4832]: I1003 00:36:00.956642 4832 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="service-telemetry/infrawatch-operators-jcsnv" Oct 03 00:36:00 crc kubenswrapper[4832]: I1003 00:36:00.987170 4832 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="service-telemetry/infrawatch-operators-jcsnv" Oct 03 00:36:01 crc kubenswrapper[4832]: I1003 00:36:01.298244 4832 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="service-telemetry/infrawatch-operators-jcsnv" Oct 03 00:36:04 crc kubenswrapper[4832]: I1003 00:36:04.860227 4832 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c0995k56"] Oct 03 00:36:04 crc kubenswrapper[4832]: E1003 00:36:04.860726 4832 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66db0ed3-79c9-4814-b3ad-05fe6b392b5a" containerName="registry-server" Oct 03 00:36:04 crc kubenswrapper[4832]: I1003 00:36:04.860739 4832 state_mem.go:107] "Deleted CPUSet assignment" podUID="66db0ed3-79c9-4814-b3ad-05fe6b392b5a" containerName="registry-server" Oct 03 00:36:04 crc kubenswrapper[4832]: I1003 00:36:04.860865 4832 memory_manager.go:354] "RemoveStaleState removing state" podUID="66db0ed3-79c9-4814-b3ad-05fe6b392b5a" containerName="registry-server" Oct 03 00:36:04 crc kubenswrapper[4832]: I1003 00:36:04.861588 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c0995k56" Oct 03 00:36:04 crc kubenswrapper[4832]: I1003 00:36:04.872030 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c0995k56"] Oct 03 00:36:04 crc kubenswrapper[4832]: I1003 00:36:04.910491 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8w4sz\" (UniqueName: \"kubernetes.io/projected/f01476df-bcc4-4cb4-a893-b52c606aa39b-kube-api-access-8w4sz\") pod \"372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c0995k56\" (UID: \"f01476df-bcc4-4cb4-a893-b52c606aa39b\") " pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c0995k56" Oct 03 00:36:04 crc kubenswrapper[4832]: I1003 00:36:04.910549 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f01476df-bcc4-4cb4-a893-b52c606aa39b-util\") pod \"372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c0995k56\" (UID: \"f01476df-bcc4-4cb4-a893-b52c606aa39b\") " pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c0995k56" Oct 03 00:36:04 crc kubenswrapper[4832]: I1003 00:36:04.910776 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f01476df-bcc4-4cb4-a893-b52c606aa39b-bundle\") pod \"372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c0995k56\" (UID: \"f01476df-bcc4-4cb4-a893-b52c606aa39b\") " pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c0995k56" Oct 03 00:36:05 crc kubenswrapper[4832]: I1003 00:36:05.011968 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f01476df-bcc4-4cb4-a893-b52c606aa39b-util\") pod 
\"372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c0995k56\" (UID: \"f01476df-bcc4-4cb4-a893-b52c606aa39b\") " pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c0995k56" Oct 03 00:36:05 crc kubenswrapper[4832]: I1003 00:36:05.012029 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f01476df-bcc4-4cb4-a893-b52c606aa39b-bundle\") pod \"372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c0995k56\" (UID: \"f01476df-bcc4-4cb4-a893-b52c606aa39b\") " pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c0995k56" Oct 03 00:36:05 crc kubenswrapper[4832]: I1003 00:36:05.012088 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8w4sz\" (UniqueName: \"kubernetes.io/projected/f01476df-bcc4-4cb4-a893-b52c606aa39b-kube-api-access-8w4sz\") pod \"372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c0995k56\" (UID: \"f01476df-bcc4-4cb4-a893-b52c606aa39b\") " pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c0995k56" Oct 03 00:36:05 crc kubenswrapper[4832]: I1003 00:36:05.012455 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f01476df-bcc4-4cb4-a893-b52c606aa39b-util\") pod \"372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c0995k56\" (UID: \"f01476df-bcc4-4cb4-a893-b52c606aa39b\") " pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c0995k56" Oct 03 00:36:05 crc kubenswrapper[4832]: I1003 00:36:05.012950 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f01476df-bcc4-4cb4-a893-b52c606aa39b-bundle\") pod \"372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c0995k56\" (UID: \"f01476df-bcc4-4cb4-a893-b52c606aa39b\") " pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c0995k56" Oct 03 00:36:05 crc kubenswrapper[4832]: I1003 00:36:05.028787 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8w4sz\" (UniqueName: \"kubernetes.io/projected/f01476df-bcc4-4cb4-a893-b52c606aa39b-kube-api-access-8w4sz\") pod \"372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c0995k56\" (UID: \"f01476df-bcc4-4cb4-a893-b52c606aa39b\") " pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c0995k56" Oct 03 00:36:05 crc kubenswrapper[4832]: I1003 00:36:05.186732 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c0995k56" Oct 03 00:36:05 crc kubenswrapper[4832]: I1003 00:36:05.351921 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c0995k56"] Oct 03 00:36:05 crc kubenswrapper[4832]: I1003 00:36:05.671088 4832 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a47cv5"] Oct 03 00:36:05 crc kubenswrapper[4832]: I1003 00:36:05.672416 4832 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a47cv5" Oct 03 00:36:05 crc kubenswrapper[4832]: I1003 00:36:05.674564 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a47cv5"] Oct 03 00:36:05 crc kubenswrapper[4832]: I1003 00:36:05.722191 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/00ca2adc-16bd-4114-8f0d-5fbf1518d21e-bundle\") pod \"500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a47cv5\" (UID: \"00ca2adc-16bd-4114-8f0d-5fbf1518d21e\") " pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a47cv5" Oct 03 00:36:05 crc kubenswrapper[4832]: I1003 00:36:05.722240 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/00ca2adc-16bd-4114-8f0d-5fbf1518d21e-util\") pod \"500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a47cv5\" (UID: \"00ca2adc-16bd-4114-8f0d-5fbf1518d21e\") " pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a47cv5" Oct 03 00:36:05 crc kubenswrapper[4832]: I1003 00:36:05.722301 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6z4w\" (UniqueName: \"kubernetes.io/projected/00ca2adc-16bd-4114-8f0d-5fbf1518d21e-kube-api-access-d6z4w\") pod \"500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a47cv5\" (UID: \"00ca2adc-16bd-4114-8f0d-5fbf1518d21e\") " pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a47cv5" Oct 03 00:36:05 crc kubenswrapper[4832]: I1003 00:36:05.823384 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d6z4w\" (UniqueName: \"kubernetes.io/projected/00ca2adc-16bd-4114-8f0d-5fbf1518d21e-kube-api-access-d6z4w\") pod \"500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a47cv5\" (UID: \"00ca2adc-16bd-4114-8f0d-5fbf1518d21e\") " pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a47cv5" Oct 03 00:36:05 crc kubenswrapper[4832]: I1003 00:36:05.823494 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/00ca2adc-16bd-4114-8f0d-5fbf1518d21e-bundle\") pod \"500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a47cv5\" (UID: \"00ca2adc-16bd-4114-8f0d-5fbf1518d21e\") " pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a47cv5" Oct 03 00:36:05 crc kubenswrapper[4832]: I1003 00:36:05.823520 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/00ca2adc-16bd-4114-8f0d-5fbf1518d21e-util\") pod \"500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a47cv5\" (UID: \"00ca2adc-16bd-4114-8f0d-5fbf1518d21e\") " pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a47cv5" Oct 03 00:36:05 crc kubenswrapper[4832]: I1003 00:36:05.824039 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/00ca2adc-16bd-4114-8f0d-5fbf1518d21e-bundle\") pod \"500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a47cv5\" (UID: \"00ca2adc-16bd-4114-8f0d-5fbf1518d21e\") " 
pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a47cv5" Oct 03 00:36:05 crc kubenswrapper[4832]: I1003 00:36:05.824127 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/00ca2adc-16bd-4114-8f0d-5fbf1518d21e-util\") pod \"500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a47cv5\" (UID: \"00ca2adc-16bd-4114-8f0d-5fbf1518d21e\") " pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a47cv5" Oct 03 00:36:05 crc kubenswrapper[4832]: I1003 00:36:05.842150 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6z4w\" (UniqueName: \"kubernetes.io/projected/00ca2adc-16bd-4114-8f0d-5fbf1518d21e-kube-api-access-d6z4w\") pod \"500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a47cv5\" (UID: \"00ca2adc-16bd-4114-8f0d-5fbf1518d21e\") " pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a47cv5" Oct 03 00:36:06 crc kubenswrapper[4832]: I1003 00:36:06.016799 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a47cv5" Oct 03 00:36:06 crc kubenswrapper[4832]: I1003 00:36:06.185272 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a47cv5"] Oct 03 00:36:06 crc kubenswrapper[4832]: I1003 00:36:06.274593 4832 generic.go:334] "Generic (PLEG): container finished" podID="f01476df-bcc4-4cb4-a893-b52c606aa39b" containerID="8b4ec0a9cafc8aecdf184821ec6fe8d8faaefecaf4b4cbc7d364b2f40b949344" exitCode=0 Oct 03 00:36:06 crc kubenswrapper[4832]: I1003 00:36:06.274671 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c0995k56" event={"ID":"f01476df-bcc4-4cb4-a893-b52c606aa39b","Type":"ContainerDied","Data":"8b4ec0a9cafc8aecdf184821ec6fe8d8faaefecaf4b4cbc7d364b2f40b949344"} Oct 03 00:36:06 crc kubenswrapper[4832]: I1003 00:36:06.274705 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c0995k56" event={"ID":"f01476df-bcc4-4cb4-a893-b52c606aa39b","Type":"ContainerStarted","Data":"6fd3a4760ada220223b92b9f6ad086c38c79df90bed65ef673f14b09f89ba831"} Oct 03 00:36:06 crc kubenswrapper[4832]: I1003 00:36:06.276251 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a47cv5" event={"ID":"00ca2adc-16bd-4114-8f0d-5fbf1518d21e","Type":"ContainerStarted","Data":"1a814fc234c77eb843865b8eca20be1f263af3ed1fb82e5677ded0746c9893f5"} Oct 03 00:36:06 crc kubenswrapper[4832]: I1003 00:36:06.386350 4832 scope.go:117] "RemoveContainer" containerID="e44b7de9ac0dc4f733d645c5d56e63447d6c890efc8c243af73d5c7937b609ee" Oct 03 00:36:06 crc kubenswrapper[4832]: E1003 00:36:06.388069 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xzpk4_openshift-machine-config-operator(13eeb165-19f7-4e3f-a442-1c9c570d853b)\"" pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" podUID="13eeb165-19f7-4e3f-a442-1c9c570d853b" Oct 03 00:36:07 crc kubenswrapper[4832]: I1003 00:36:07.284845 4832 generic.go:334] "Generic (PLEG): container finished" 
podID="00ca2adc-16bd-4114-8f0d-5fbf1518d21e" containerID="33541fb6884922c416bfa26627afb238b37dcadee4d90fdef7754c1bdb042f06" exitCode=0 Oct 03 00:36:07 crc kubenswrapper[4832]: I1003 00:36:07.284922 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a47cv5" event={"ID":"00ca2adc-16bd-4114-8f0d-5fbf1518d21e","Type":"ContainerDied","Data":"33541fb6884922c416bfa26627afb238b37dcadee4d90fdef7754c1bdb042f06"} Oct 03 00:36:07 crc kubenswrapper[4832]: I1003 00:36:07.287601 4832 generic.go:334] "Generic (PLEG): container finished" podID="f01476df-bcc4-4cb4-a893-b52c606aa39b" containerID="176139a4694aaad0c250c5d5dab2935021ca674ae15331a4042e24de0fd33bda" exitCode=0 Oct 03 00:36:07 crc kubenswrapper[4832]: I1003 00:36:07.287640 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c0995k56" event={"ID":"f01476df-bcc4-4cb4-a893-b52c606aa39b","Type":"ContainerDied","Data":"176139a4694aaad0c250c5d5dab2935021ca674ae15331a4042e24de0fd33bda"} Oct 03 00:36:07 crc kubenswrapper[4832]: I1003 00:36:07.288090 4832 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 03 00:36:08 crc kubenswrapper[4832]: I1003 00:36:08.295952 4832 generic.go:334] "Generic (PLEG): container finished" podID="f01476df-bcc4-4cb4-a893-b52c606aa39b" containerID="146e36b4bc218d0b9b438f6292ddb5dd9580352b6370f794cf92e3acb8fffd26" exitCode=0 Oct 03 00:36:08 crc kubenswrapper[4832]: I1003 00:36:08.296043 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c0995k56" event={"ID":"f01476df-bcc4-4cb4-a893-b52c606aa39b","Type":"ContainerDied","Data":"146e36b4bc218d0b9b438f6292ddb5dd9580352b6370f794cf92e3acb8fffd26"} Oct 03 00:36:08 crc kubenswrapper[4832]: I1003 00:36:08.299900 4832 generic.go:334] "Generic (PLEG): container finished" podID="00ca2adc-16bd-4114-8f0d-5fbf1518d21e" containerID="c02e46920c80493428d083274ce61ec84aa363a3b89c7fa183b9e0e5e96cef20" exitCode=0 Oct 03 00:36:08 crc kubenswrapper[4832]: I1003 00:36:08.299963 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a47cv5" event={"ID":"00ca2adc-16bd-4114-8f0d-5fbf1518d21e","Type":"ContainerDied","Data":"c02e46920c80493428d083274ce61ec84aa363a3b89c7fa183b9e0e5e96cef20"} Oct 03 00:36:09 crc kubenswrapper[4832]: I1003 00:36:09.306793 4832 generic.go:334] "Generic (PLEG): container finished" podID="00ca2adc-16bd-4114-8f0d-5fbf1518d21e" containerID="d1170c97dc3c291d19742ae5b977c940a067246435f42be7ab2cbfedc7a863c5" exitCode=0 Oct 03 00:36:09 crc kubenswrapper[4832]: I1003 00:36:09.306888 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a47cv5" event={"ID":"00ca2adc-16bd-4114-8f0d-5fbf1518d21e","Type":"ContainerDied","Data":"d1170c97dc3c291d19742ae5b977c940a067246435f42be7ab2cbfedc7a863c5"} Oct 03 00:36:09 crc kubenswrapper[4832]: I1003 00:36:09.522543 4832 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c0995k56" Oct 03 00:36:09 crc kubenswrapper[4832]: I1003 00:36:09.570536 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f01476df-bcc4-4cb4-a893-b52c606aa39b-bundle\") pod \"f01476df-bcc4-4cb4-a893-b52c606aa39b\" (UID: \"f01476df-bcc4-4cb4-a893-b52c606aa39b\") " Oct 03 00:36:09 crc kubenswrapper[4832]: I1003 00:36:09.570617 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8w4sz\" (UniqueName: \"kubernetes.io/projected/f01476df-bcc4-4cb4-a893-b52c606aa39b-kube-api-access-8w4sz\") pod \"f01476df-bcc4-4cb4-a893-b52c606aa39b\" (UID: \"f01476df-bcc4-4cb4-a893-b52c606aa39b\") " Oct 03 00:36:09 crc kubenswrapper[4832]: I1003 00:36:09.570689 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f01476df-bcc4-4cb4-a893-b52c606aa39b-util\") pod \"f01476df-bcc4-4cb4-a893-b52c606aa39b\" (UID: \"f01476df-bcc4-4cb4-a893-b52c606aa39b\") " Oct 03 00:36:09 crc kubenswrapper[4832]: I1003 00:36:09.571185 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f01476df-bcc4-4cb4-a893-b52c606aa39b-bundle" (OuterVolumeSpecName: "bundle") pod "f01476df-bcc4-4cb4-a893-b52c606aa39b" (UID: "f01476df-bcc4-4cb4-a893-b52c606aa39b"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 00:36:09 crc kubenswrapper[4832]: I1003 00:36:09.576926 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f01476df-bcc4-4cb4-a893-b52c606aa39b-kube-api-access-8w4sz" (OuterVolumeSpecName: "kube-api-access-8w4sz") pod "f01476df-bcc4-4cb4-a893-b52c606aa39b" (UID: "f01476df-bcc4-4cb4-a893-b52c606aa39b"). InnerVolumeSpecName "kube-api-access-8w4sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 00:36:09 crc kubenswrapper[4832]: I1003 00:36:09.587061 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f01476df-bcc4-4cb4-a893-b52c606aa39b-util" (OuterVolumeSpecName: "util") pod "f01476df-bcc4-4cb4-a893-b52c606aa39b" (UID: "f01476df-bcc4-4cb4-a893-b52c606aa39b"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 00:36:09 crc kubenswrapper[4832]: I1003 00:36:09.672883 4832 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f01476df-bcc4-4cb4-a893-b52c606aa39b-util\") on node \"crc\" DevicePath \"\"" Oct 03 00:36:09 crc kubenswrapper[4832]: I1003 00:36:09.672935 4832 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f01476df-bcc4-4cb4-a893-b52c606aa39b-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 00:36:09 crc kubenswrapper[4832]: I1003 00:36:09.672949 4832 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8w4sz\" (UniqueName: \"kubernetes.io/projected/f01476df-bcc4-4cb4-a893-b52c606aa39b-kube-api-access-8w4sz\") on node \"crc\" DevicePath \"\"" Oct 03 00:36:10 crc kubenswrapper[4832]: I1003 00:36:10.316048 4832 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c0995k56" Oct 03 00:36:10 crc kubenswrapper[4832]: I1003 00:36:10.316130 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c0995k56" event={"ID":"f01476df-bcc4-4cb4-a893-b52c606aa39b","Type":"ContainerDied","Data":"6fd3a4760ada220223b92b9f6ad086c38c79df90bed65ef673f14b09f89ba831"} Oct 03 00:36:10 crc kubenswrapper[4832]: I1003 00:36:10.316164 4832 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6fd3a4760ada220223b92b9f6ad086c38c79df90bed65ef673f14b09f89ba831" Oct 03 00:36:10 crc kubenswrapper[4832]: I1003 00:36:10.508109 4832 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a47cv5" Oct 03 00:36:10 crc kubenswrapper[4832]: I1003 00:36:10.587746 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/00ca2adc-16bd-4114-8f0d-5fbf1518d21e-util\") pod \"00ca2adc-16bd-4114-8f0d-5fbf1518d21e\" (UID: \"00ca2adc-16bd-4114-8f0d-5fbf1518d21e\") " Oct 03 00:36:10 crc kubenswrapper[4832]: I1003 00:36:10.588176 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/00ca2adc-16bd-4114-8f0d-5fbf1518d21e-bundle\") pod \"00ca2adc-16bd-4114-8f0d-5fbf1518d21e\" (UID: \"00ca2adc-16bd-4114-8f0d-5fbf1518d21e\") " Oct 03 00:36:10 crc kubenswrapper[4832]: I1003 00:36:10.588293 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6z4w\" (UniqueName: \"kubernetes.io/projected/00ca2adc-16bd-4114-8f0d-5fbf1518d21e-kube-api-access-d6z4w\") pod \"00ca2adc-16bd-4114-8f0d-5fbf1518d21e\" (UID: \"00ca2adc-16bd-4114-8f0d-5fbf1518d21e\") " Oct 03 00:36:10 crc kubenswrapper[4832]: I1003 00:36:10.588716 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/00ca2adc-16bd-4114-8f0d-5fbf1518d21e-bundle" (OuterVolumeSpecName: "bundle") pod "00ca2adc-16bd-4114-8f0d-5fbf1518d21e" (UID: "00ca2adc-16bd-4114-8f0d-5fbf1518d21e"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 00:36:10 crc kubenswrapper[4832]: I1003 00:36:10.593870 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00ca2adc-16bd-4114-8f0d-5fbf1518d21e-kube-api-access-d6z4w" (OuterVolumeSpecName: "kube-api-access-d6z4w") pod "00ca2adc-16bd-4114-8f0d-5fbf1518d21e" (UID: "00ca2adc-16bd-4114-8f0d-5fbf1518d21e"). InnerVolumeSpecName "kube-api-access-d6z4w". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 00:36:10 crc kubenswrapper[4832]: I1003 00:36:10.612414 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/00ca2adc-16bd-4114-8f0d-5fbf1518d21e-util" (OuterVolumeSpecName: "util") pod "00ca2adc-16bd-4114-8f0d-5fbf1518d21e" (UID: "00ca2adc-16bd-4114-8f0d-5fbf1518d21e"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 00:36:10 crc kubenswrapper[4832]: I1003 00:36:10.690022 4832 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/00ca2adc-16bd-4114-8f0d-5fbf1518d21e-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 00:36:10 crc kubenswrapper[4832]: I1003 00:36:10.690054 4832 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6z4w\" (UniqueName: \"kubernetes.io/projected/00ca2adc-16bd-4114-8f0d-5fbf1518d21e-kube-api-access-d6z4w\") on node \"crc\" DevicePath \"\"" Oct 03 00:36:10 crc kubenswrapper[4832]: I1003 00:36:10.690065 4832 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/00ca2adc-16bd-4114-8f0d-5fbf1518d21e-util\") on node \"crc\" DevicePath \"\"" Oct 03 00:36:11 crc kubenswrapper[4832]: I1003 00:36:11.323383 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a47cv5" event={"ID":"00ca2adc-16bd-4114-8f0d-5fbf1518d21e","Type":"ContainerDied","Data":"1a814fc234c77eb843865b8eca20be1f263af3ed1fb82e5677ded0746c9893f5"} Oct 03 00:36:11 crc kubenswrapper[4832]: I1003 00:36:11.323685 4832 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1a814fc234c77eb843865b8eca20be1f263af3ed1fb82e5677ded0746c9893f5" Oct 03 00:36:11 crc kubenswrapper[4832]: I1003 00:36:11.323439 4832 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a47cv5" Oct 03 00:36:16 crc kubenswrapper[4832]: I1003 00:36:16.536369 4832 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/service-telemetry-operator-7d6c88b75b-sjdm5"] Oct 03 00:36:16 crc kubenswrapper[4832]: E1003 00:36:16.537105 4832 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f01476df-bcc4-4cb4-a893-b52c606aa39b" containerName="extract" Oct 03 00:36:16 crc kubenswrapper[4832]: I1003 00:36:16.537120 4832 state_mem.go:107] "Deleted CPUSet assignment" podUID="f01476df-bcc4-4cb4-a893-b52c606aa39b" containerName="extract" Oct 03 00:36:16 crc kubenswrapper[4832]: E1003 00:36:16.537134 4832 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f01476df-bcc4-4cb4-a893-b52c606aa39b" containerName="util" Oct 03 00:36:16 crc kubenswrapper[4832]: I1003 00:36:16.537141 4832 state_mem.go:107] "Deleted CPUSet assignment" podUID="f01476df-bcc4-4cb4-a893-b52c606aa39b" containerName="util" Oct 03 00:36:16 crc kubenswrapper[4832]: E1003 00:36:16.537150 4832 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f01476df-bcc4-4cb4-a893-b52c606aa39b" containerName="pull" Oct 03 00:36:16 crc kubenswrapper[4832]: I1003 00:36:16.537155 4832 state_mem.go:107] "Deleted CPUSet assignment" podUID="f01476df-bcc4-4cb4-a893-b52c606aa39b" containerName="pull" Oct 03 00:36:16 crc kubenswrapper[4832]: E1003 00:36:16.537169 4832 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00ca2adc-16bd-4114-8f0d-5fbf1518d21e" containerName="pull" Oct 03 00:36:16 crc kubenswrapper[4832]: I1003 00:36:16.537175 4832 state_mem.go:107] "Deleted CPUSet assignment" podUID="00ca2adc-16bd-4114-8f0d-5fbf1518d21e" containerName="pull" Oct 03 00:36:16 crc kubenswrapper[4832]: E1003 00:36:16.537185 4832 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00ca2adc-16bd-4114-8f0d-5fbf1518d21e" containerName="extract" Oct 03 00:36:16 crc 
kubenswrapper[4832]: I1003 00:36:16.537191 4832 state_mem.go:107] "Deleted CPUSet assignment" podUID="00ca2adc-16bd-4114-8f0d-5fbf1518d21e" containerName="extract" Oct 03 00:36:16 crc kubenswrapper[4832]: E1003 00:36:16.537204 4832 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00ca2adc-16bd-4114-8f0d-5fbf1518d21e" containerName="util" Oct 03 00:36:16 crc kubenswrapper[4832]: I1003 00:36:16.537210 4832 state_mem.go:107] "Deleted CPUSet assignment" podUID="00ca2adc-16bd-4114-8f0d-5fbf1518d21e" containerName="util" Oct 03 00:36:16 crc kubenswrapper[4832]: I1003 00:36:16.537304 4832 memory_manager.go:354] "RemoveStaleState removing state" podUID="f01476df-bcc4-4cb4-a893-b52c606aa39b" containerName="extract" Oct 03 00:36:16 crc kubenswrapper[4832]: I1003 00:36:16.537319 4832 memory_manager.go:354] "RemoveStaleState removing state" podUID="00ca2adc-16bd-4114-8f0d-5fbf1518d21e" containerName="extract" Oct 03 00:36:16 crc kubenswrapper[4832]: I1003 00:36:16.537724 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-7d6c88b75b-sjdm5" Oct 03 00:36:16 crc kubenswrapper[4832]: I1003 00:36:16.558112 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-7d6c88b75b-sjdm5"] Oct 03 00:36:16 crc kubenswrapper[4832]: I1003 00:36:16.559866 4832 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"service-telemetry-operator-dockercfg-7lxz9" Oct 03 00:36:16 crc kubenswrapper[4832]: I1003 00:36:16.566230 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"runner\" (UniqueName: \"kubernetes.io/empty-dir/bafba4cc-4e6f-47fd-be34-4547031375e2-runner\") pod \"service-telemetry-operator-7d6c88b75b-sjdm5\" (UID: \"bafba4cc-4e6f-47fd-be34-4547031375e2\") " pod="service-telemetry/service-telemetry-operator-7d6c88b75b-sjdm5" Oct 03 00:36:16 crc kubenswrapper[4832]: I1003 00:36:16.566502 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qfh8m\" (UniqueName: \"kubernetes.io/projected/bafba4cc-4e6f-47fd-be34-4547031375e2-kube-api-access-qfh8m\") pod \"service-telemetry-operator-7d6c88b75b-sjdm5\" (UID: \"bafba4cc-4e6f-47fd-be34-4547031375e2\") " pod="service-telemetry/service-telemetry-operator-7d6c88b75b-sjdm5" Oct 03 00:36:16 crc kubenswrapper[4832]: I1003 00:36:16.667729 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"runner\" (UniqueName: \"kubernetes.io/empty-dir/bafba4cc-4e6f-47fd-be34-4547031375e2-runner\") pod \"service-telemetry-operator-7d6c88b75b-sjdm5\" (UID: \"bafba4cc-4e6f-47fd-be34-4547031375e2\") " pod="service-telemetry/service-telemetry-operator-7d6c88b75b-sjdm5" Oct 03 00:36:16 crc kubenswrapper[4832]: I1003 00:36:16.667823 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qfh8m\" (UniqueName: \"kubernetes.io/projected/bafba4cc-4e6f-47fd-be34-4547031375e2-kube-api-access-qfh8m\") pod \"service-telemetry-operator-7d6c88b75b-sjdm5\" (UID: \"bafba4cc-4e6f-47fd-be34-4547031375e2\") " pod="service-telemetry/service-telemetry-operator-7d6c88b75b-sjdm5" Oct 03 00:36:16 crc kubenswrapper[4832]: I1003 00:36:16.668872 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"runner\" (UniqueName: \"kubernetes.io/empty-dir/bafba4cc-4e6f-47fd-be34-4547031375e2-runner\") pod \"service-telemetry-operator-7d6c88b75b-sjdm5\" (UID: 
\"bafba4cc-4e6f-47fd-be34-4547031375e2\") " pod="service-telemetry/service-telemetry-operator-7d6c88b75b-sjdm5" Oct 03 00:36:16 crc kubenswrapper[4832]: I1003 00:36:16.689748 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qfh8m\" (UniqueName: \"kubernetes.io/projected/bafba4cc-4e6f-47fd-be34-4547031375e2-kube-api-access-qfh8m\") pod \"service-telemetry-operator-7d6c88b75b-sjdm5\" (UID: \"bafba4cc-4e6f-47fd-be34-4547031375e2\") " pod="service-telemetry/service-telemetry-operator-7d6c88b75b-sjdm5" Oct 03 00:36:16 crc kubenswrapper[4832]: I1003 00:36:16.854063 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-7d6c88b75b-sjdm5" Oct 03 00:36:17 crc kubenswrapper[4832]: I1003 00:36:17.385468 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-7d6c88b75b-sjdm5"] Oct 03 00:36:18 crc kubenswrapper[4832]: I1003 00:36:18.369389 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-7d6c88b75b-sjdm5" event={"ID":"bafba4cc-4e6f-47fd-be34-4547031375e2","Type":"ContainerStarted","Data":"b6233ee799b8c8d6f09e28873e5c423dea927801a3c25f1e145218c7afe3990f"} Oct 03 00:36:19 crc kubenswrapper[4832]: I1003 00:36:19.356097 4832 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/smart-gateway-operator-547fdfd8bd-xdfmh"] Oct 03 00:36:19 crc kubenswrapper[4832]: I1003 00:36:19.360982 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-547fdfd8bd-xdfmh" Oct 03 00:36:19 crc kubenswrapper[4832]: I1003 00:36:19.363169 4832 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"smart-gateway-operator-dockercfg-lp2nq" Oct 03 00:36:19 crc kubenswrapper[4832]: I1003 00:36:19.371254 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-547fdfd8bd-xdfmh"] Oct 03 00:36:19 crc kubenswrapper[4832]: I1003 00:36:19.406247 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"runner\" (UniqueName: \"kubernetes.io/empty-dir/7eb70cec-c2c0-46a3-9156-8f652211e548-runner\") pod \"smart-gateway-operator-547fdfd8bd-xdfmh\" (UID: \"7eb70cec-c2c0-46a3-9156-8f652211e548\") " pod="service-telemetry/smart-gateway-operator-547fdfd8bd-xdfmh" Oct 03 00:36:19 crc kubenswrapper[4832]: I1003 00:36:19.406530 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z79sf\" (UniqueName: \"kubernetes.io/projected/7eb70cec-c2c0-46a3-9156-8f652211e548-kube-api-access-z79sf\") pod \"smart-gateway-operator-547fdfd8bd-xdfmh\" (UID: \"7eb70cec-c2c0-46a3-9156-8f652211e548\") " pod="service-telemetry/smart-gateway-operator-547fdfd8bd-xdfmh" Oct 03 00:36:19 crc kubenswrapper[4832]: I1003 00:36:19.507636 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"runner\" (UniqueName: \"kubernetes.io/empty-dir/7eb70cec-c2c0-46a3-9156-8f652211e548-runner\") pod \"smart-gateway-operator-547fdfd8bd-xdfmh\" (UID: \"7eb70cec-c2c0-46a3-9156-8f652211e548\") " pod="service-telemetry/smart-gateway-operator-547fdfd8bd-xdfmh" Oct 03 00:36:19 crc kubenswrapper[4832]: I1003 00:36:19.507692 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z79sf\" (UniqueName: 
\"kubernetes.io/projected/7eb70cec-c2c0-46a3-9156-8f652211e548-kube-api-access-z79sf\") pod \"smart-gateway-operator-547fdfd8bd-xdfmh\" (UID: \"7eb70cec-c2c0-46a3-9156-8f652211e548\") " pod="service-telemetry/smart-gateway-operator-547fdfd8bd-xdfmh" Oct 03 00:36:19 crc kubenswrapper[4832]: I1003 00:36:19.508977 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"runner\" (UniqueName: \"kubernetes.io/empty-dir/7eb70cec-c2c0-46a3-9156-8f652211e548-runner\") pod \"smart-gateway-operator-547fdfd8bd-xdfmh\" (UID: \"7eb70cec-c2c0-46a3-9156-8f652211e548\") " pod="service-telemetry/smart-gateway-operator-547fdfd8bd-xdfmh" Oct 03 00:36:19 crc kubenswrapper[4832]: I1003 00:36:19.539129 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z79sf\" (UniqueName: \"kubernetes.io/projected/7eb70cec-c2c0-46a3-9156-8f652211e548-kube-api-access-z79sf\") pod \"smart-gateway-operator-547fdfd8bd-xdfmh\" (UID: \"7eb70cec-c2c0-46a3-9156-8f652211e548\") " pod="service-telemetry/smart-gateway-operator-547fdfd8bd-xdfmh" Oct 03 00:36:19 crc kubenswrapper[4832]: I1003 00:36:19.689173 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-547fdfd8bd-xdfmh" Oct 03 00:36:20 crc kubenswrapper[4832]: W1003 00:36:20.077558 4832 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7eb70cec_c2c0_46a3_9156_8f652211e548.slice/crio-0a5d648898dba7b38f9460812e4b6a0e15613c26fe2c0cb56ba5eaf8168afb31 WatchSource:0}: Error finding container 0a5d648898dba7b38f9460812e4b6a0e15613c26fe2c0cb56ba5eaf8168afb31: Status 404 returned error can't find the container with id 0a5d648898dba7b38f9460812e4b6a0e15613c26fe2c0cb56ba5eaf8168afb31 Oct 03 00:36:20 crc kubenswrapper[4832]: I1003 00:36:20.082306 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-547fdfd8bd-xdfmh"] Oct 03 00:36:20 crc kubenswrapper[4832]: I1003 00:36:20.386447 4832 scope.go:117] "RemoveContainer" containerID="e44b7de9ac0dc4f733d645c5d56e63447d6c890efc8c243af73d5c7937b609ee" Oct 03 00:36:20 crc kubenswrapper[4832]: E1003 00:36:20.386979 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xzpk4_openshift-machine-config-operator(13eeb165-19f7-4e3f-a442-1c9c570d853b)\"" pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" podUID="13eeb165-19f7-4e3f-a442-1c9c570d853b" Oct 03 00:36:20 crc kubenswrapper[4832]: I1003 00:36:20.388653 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-547fdfd8bd-xdfmh" event={"ID":"7eb70cec-c2c0-46a3-9156-8f652211e548","Type":"ContainerStarted","Data":"0a5d648898dba7b38f9460812e4b6a0e15613c26fe2c0cb56ba5eaf8168afb31"} Oct 03 00:36:31 crc kubenswrapper[4832]: I1003 00:36:31.827672 4832 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-wmx4k"] Oct 03 00:36:31 crc kubenswrapper[4832]: I1003 00:36:31.829671 4832 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-wmx4k" Oct 03 00:36:31 crc kubenswrapper[4832]: I1003 00:36:31.853106 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wmx4k"] Oct 03 00:36:31 crc kubenswrapper[4832]: I1003 00:36:31.897633 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f50b9899-2b27-49c9-bd77-198464717c52-catalog-content\") pod \"certified-operators-wmx4k\" (UID: \"f50b9899-2b27-49c9-bd77-198464717c52\") " pod="openshift-marketplace/certified-operators-wmx4k" Oct 03 00:36:31 crc kubenswrapper[4832]: I1003 00:36:31.897724 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f50b9899-2b27-49c9-bd77-198464717c52-utilities\") pod \"certified-operators-wmx4k\" (UID: \"f50b9899-2b27-49c9-bd77-198464717c52\") " pod="openshift-marketplace/certified-operators-wmx4k" Oct 03 00:36:31 crc kubenswrapper[4832]: I1003 00:36:31.897791 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ncknd\" (UniqueName: \"kubernetes.io/projected/f50b9899-2b27-49c9-bd77-198464717c52-kube-api-access-ncknd\") pod \"certified-operators-wmx4k\" (UID: \"f50b9899-2b27-49c9-bd77-198464717c52\") " pod="openshift-marketplace/certified-operators-wmx4k" Oct 03 00:36:31 crc kubenswrapper[4832]: I1003 00:36:31.999021 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ncknd\" (UniqueName: \"kubernetes.io/projected/f50b9899-2b27-49c9-bd77-198464717c52-kube-api-access-ncknd\") pod \"certified-operators-wmx4k\" (UID: \"f50b9899-2b27-49c9-bd77-198464717c52\") " pod="openshift-marketplace/certified-operators-wmx4k" Oct 03 00:36:31 crc kubenswrapper[4832]: I1003 00:36:31.999125 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f50b9899-2b27-49c9-bd77-198464717c52-catalog-content\") pod \"certified-operators-wmx4k\" (UID: \"f50b9899-2b27-49c9-bd77-198464717c52\") " pod="openshift-marketplace/certified-operators-wmx4k" Oct 03 00:36:31 crc kubenswrapper[4832]: I1003 00:36:31.999150 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f50b9899-2b27-49c9-bd77-198464717c52-utilities\") pod \"certified-operators-wmx4k\" (UID: \"f50b9899-2b27-49c9-bd77-198464717c52\") " pod="openshift-marketplace/certified-operators-wmx4k" Oct 03 00:36:31 crc kubenswrapper[4832]: I1003 00:36:31.999575 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f50b9899-2b27-49c9-bd77-198464717c52-utilities\") pod \"certified-operators-wmx4k\" (UID: \"f50b9899-2b27-49c9-bd77-198464717c52\") " pod="openshift-marketplace/certified-operators-wmx4k" Oct 03 00:36:31 crc kubenswrapper[4832]: I1003 00:36:31.999875 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f50b9899-2b27-49c9-bd77-198464717c52-catalog-content\") pod \"certified-operators-wmx4k\" (UID: \"f50b9899-2b27-49c9-bd77-198464717c52\") " pod="openshift-marketplace/certified-operators-wmx4k" Oct 03 00:36:32 crc kubenswrapper[4832]: I1003 00:36:32.021972 4832 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-ncknd\" (UniqueName: \"kubernetes.io/projected/f50b9899-2b27-49c9-bd77-198464717c52-kube-api-access-ncknd\") pod \"certified-operators-wmx4k\" (UID: \"f50b9899-2b27-49c9-bd77-198464717c52\") " pod="openshift-marketplace/certified-operators-wmx4k" Oct 03 00:36:32 crc kubenswrapper[4832]: I1003 00:36:32.154351 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wmx4k" Oct 03 00:36:34 crc kubenswrapper[4832]: E1003 00:36:34.168041 4832 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/infrawatch/service-telemetry-operator:latest" Oct 03 00:36:34 crc kubenswrapper[4832]: E1003 00:36:34.169173 4832 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/infrawatch/service-telemetry-operator:latest,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:WATCH_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.annotations['olm.targetNamespaces'],},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:OPERATOR_NAME,Value:service-telemetry-operator,ValueFrom:nil,},EnvVar{Name:ANSIBLE_GATHERING,Value:explicit,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_PROMETHEUS_WEBHOOK_SNMP_IMAGE,Value:image-registry.openshift-image-registry.svc:5000/service-telemetry/prometheus-webhook-snmp:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OAUTH_PROXY_IMAGE,Value:quay.io/openshift/origin-oauth-proxy:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_PROMETHEUS_IMAGE,Value:quay.io/prometheus/prometheus:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_ALERTMANAGER_IMAGE,Value:quay.io/prometheus/alertmanager:latest,ValueFrom:nil,},EnvVar{Name:OPERATOR_CONDITION_NAME,Value:service-telemetry-operator.v1.5.1759451640,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:runner,ReadOnly:false,MountPath:/tmp/ansible-operator/runner,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qfh8m,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000670000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod service-telemetry-operator-7d6c88b75b-sjdm5_service-telemetry(bafba4cc-4e6f-47fd-be34-4547031375e2): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 03 00:36:34 crc kubenswrapper[4832]: E1003 00:36:34.170450 4832 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="service-telemetry/service-telemetry-operator-7d6c88b75b-sjdm5" podUID="bafba4cc-4e6f-47fd-be34-4547031375e2" Oct 03 00:36:34 crc kubenswrapper[4832]: I1003 00:36:34.238596 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wmx4k"] Oct 03 00:36:34 crc kubenswrapper[4832]: I1003 00:36:34.386983 4832 scope.go:117] "RemoveContainer" containerID="e44b7de9ac0dc4f733d645c5d56e63447d6c890efc8c243af73d5c7937b609ee" Oct 03 00:36:34 crc kubenswrapper[4832]: E1003 00:36:34.547660 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/infrawatch/service-telemetry-operator:latest\\\"\"" pod="service-telemetry/service-telemetry-operator-7d6c88b75b-sjdm5" podUID="bafba4cc-4e6f-47fd-be34-4547031375e2" Oct 03 00:36:36 crc kubenswrapper[4832]: I1003 00:36:36.565608 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wmx4k" event={"ID":"f50b9899-2b27-49c9-bd77-198464717c52","Type":"ContainerStarted","Data":"e13db079b052f9c7f1bff603487e5321fb2bb58bc0760a8e59e627e27b5ad0b5"} Oct 03 00:36:37 crc kubenswrapper[4832]: I1003 00:36:37.577859 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" event={"ID":"13eeb165-19f7-4e3f-a442-1c9c570d853b","Type":"ContainerStarted","Data":"4afdf1456e895a3ee6a789bc43670d5f515cd55a0ecec2fcf0dfe7cbf97d3896"} Oct 03 00:36:37 crc kubenswrapper[4832]: I1003 00:36:37.581553 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-547fdfd8bd-xdfmh" event={"ID":"7eb70cec-c2c0-46a3-9156-8f652211e548","Type":"ContainerStarted","Data":"ec65144bfa1cf020bf317aed65ea5c669dff5b4b17d00c755d7e049f90b2bbac"} Oct 03 00:36:37 crc kubenswrapper[4832]: I1003 00:36:37.584486 4832 generic.go:334] "Generic (PLEG): container finished" podID="f50b9899-2b27-49c9-bd77-198464717c52" containerID="bf033e3a59702e0f7b9797152adc596c468f9d1c9129b8c305e037c87e5f7bf9" exitCode=0 Oct 03 00:36:37 crc kubenswrapper[4832]: I1003 00:36:37.584561 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wmx4k" event={"ID":"f50b9899-2b27-49c9-bd77-198464717c52","Type":"ContainerDied","Data":"bf033e3a59702e0f7b9797152adc596c468f9d1c9129b8c305e037c87e5f7bf9"} Oct 03 00:36:37 crc kubenswrapper[4832]: I1003 00:36:37.643788 4832 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/smart-gateway-operator-547fdfd8bd-xdfmh" podStartSLOduration=2.148825653 podStartE2EDuration="18.643767787s" podCreationTimestamp="2025-10-03 00:36:19 +0000 UTC" firstStartedPulling="2025-10-03 00:36:20.0792643 +0000 UTC m=+1773.341419406" lastFinishedPulling="2025-10-03 00:36:36.574206434 +0000 UTC m=+1789.836361540" observedRunningTime="2025-10-03 00:36:37.637231958 +0000 UTC m=+1790.899387064" watchObservedRunningTime="2025-10-03 00:36:37.643767787 +0000 UTC m=+1790.905922893" Oct 03 00:36:38 crc kubenswrapper[4832]: I1003 00:36:38.595083 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wmx4k" 
event={"ID":"f50b9899-2b27-49c9-bd77-198464717c52","Type":"ContainerStarted","Data":"1576adf6c80386c0c09416666e8ae1771ba6247eaa489f5aa7aa8d5ec411e91d"} Oct 03 00:36:39 crc kubenswrapper[4832]: I1003 00:36:39.606236 4832 generic.go:334] "Generic (PLEG): container finished" podID="f50b9899-2b27-49c9-bd77-198464717c52" containerID="1576adf6c80386c0c09416666e8ae1771ba6247eaa489f5aa7aa8d5ec411e91d" exitCode=0 Oct 03 00:36:39 crc kubenswrapper[4832]: I1003 00:36:39.606444 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wmx4k" event={"ID":"f50b9899-2b27-49c9-bd77-198464717c52","Type":"ContainerDied","Data":"1576adf6c80386c0c09416666e8ae1771ba6247eaa489f5aa7aa8d5ec411e91d"} Oct 03 00:36:40 crc kubenswrapper[4832]: I1003 00:36:40.614357 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wmx4k" event={"ID":"f50b9899-2b27-49c9-bd77-198464717c52","Type":"ContainerStarted","Data":"f4f05aa9fcb42f8b2cbf2180a8c7593b82e705585518a6294457fe5aeaecfdd8"} Oct 03 00:36:40 crc kubenswrapper[4832]: I1003 00:36:40.632116 4832 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-wmx4k" podStartSLOduration=7.027877415 podStartE2EDuration="9.632098289s" podCreationTimestamp="2025-10-03 00:36:31 +0000 UTC" firstStartedPulling="2025-10-03 00:36:37.588336971 +0000 UTC m=+1790.850492117" lastFinishedPulling="2025-10-03 00:36:40.192557845 +0000 UTC m=+1793.454712991" observedRunningTime="2025-10-03 00:36:40.631457128 +0000 UTC m=+1793.893612234" watchObservedRunningTime="2025-10-03 00:36:40.632098289 +0000 UTC m=+1793.894253395" Oct 03 00:36:42 crc kubenswrapper[4832]: I1003 00:36:42.154859 4832 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-wmx4k" Oct 03 00:36:42 crc kubenswrapper[4832]: I1003 00:36:42.154943 4832 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-wmx4k" Oct 03 00:36:43 crc kubenswrapper[4832]: I1003 00:36:43.196399 4832 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-wmx4k" podUID="f50b9899-2b27-49c9-bd77-198464717c52" containerName="registry-server" probeResult="failure" output=< Oct 03 00:36:43 crc kubenswrapper[4832]: timeout: failed to connect service ":50051" within 1s Oct 03 00:36:43 crc kubenswrapper[4832]: > Oct 03 00:36:50 crc kubenswrapper[4832]: I1003 00:36:50.679334 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-7d6c88b75b-sjdm5" event={"ID":"bafba4cc-4e6f-47fd-be34-4547031375e2","Type":"ContainerStarted","Data":"46f3e3a6e0a7e57516b02baafe3097ba59f5c4ad9b513a7dfd316b744f04272c"} Oct 03 00:36:50 crc kubenswrapper[4832]: I1003 00:36:50.696394 4832 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/service-telemetry-operator-7d6c88b75b-sjdm5" podStartSLOduration=2.204922344 podStartE2EDuration="34.696375655s" podCreationTimestamp="2025-10-03 00:36:16 +0000 UTC" firstStartedPulling="2025-10-03 00:36:17.402390423 +0000 UTC m=+1770.664545529" lastFinishedPulling="2025-10-03 00:36:49.893843734 +0000 UTC m=+1803.155998840" observedRunningTime="2025-10-03 00:36:50.694184995 +0000 UTC m=+1803.956340111" watchObservedRunningTime="2025-10-03 00:36:50.696375655 +0000 UTC m=+1803.958530771" Oct 03 00:36:52 crc kubenswrapper[4832]: I1003 00:36:52.218494 4832 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-wmx4k" Oct 03 00:36:52 crc kubenswrapper[4832]: I1003 00:36:52.267572 4832 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-wmx4k" Oct 03 00:36:55 crc kubenswrapper[4832]: I1003 00:36:55.021107 4832 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wmx4k"] Oct 03 00:36:55 crc kubenswrapper[4832]: I1003 00:36:55.021388 4832 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-wmx4k" podUID="f50b9899-2b27-49c9-bd77-198464717c52" containerName="registry-server" containerID="cri-o://f4f05aa9fcb42f8b2cbf2180a8c7593b82e705585518a6294457fe5aeaecfdd8" gracePeriod=2 Oct 03 00:36:55 crc kubenswrapper[4832]: I1003 00:36:55.718496 4832 generic.go:334] "Generic (PLEG): container finished" podID="f50b9899-2b27-49c9-bd77-198464717c52" containerID="f4f05aa9fcb42f8b2cbf2180a8c7593b82e705585518a6294457fe5aeaecfdd8" exitCode=0 Oct 03 00:36:55 crc kubenswrapper[4832]: I1003 00:36:55.718568 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wmx4k" event={"ID":"f50b9899-2b27-49c9-bd77-198464717c52","Type":"ContainerDied","Data":"f4f05aa9fcb42f8b2cbf2180a8c7593b82e705585518a6294457fe5aeaecfdd8"} Oct 03 00:36:55 crc kubenswrapper[4832]: I1003 00:36:55.907190 4832 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wmx4k" Oct 03 00:36:56 crc kubenswrapper[4832]: I1003 00:36:56.031084 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ncknd\" (UniqueName: \"kubernetes.io/projected/f50b9899-2b27-49c9-bd77-198464717c52-kube-api-access-ncknd\") pod \"f50b9899-2b27-49c9-bd77-198464717c52\" (UID: \"f50b9899-2b27-49c9-bd77-198464717c52\") " Oct 03 00:36:56 crc kubenswrapper[4832]: I1003 00:36:56.031135 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f50b9899-2b27-49c9-bd77-198464717c52-catalog-content\") pod \"f50b9899-2b27-49c9-bd77-198464717c52\" (UID: \"f50b9899-2b27-49c9-bd77-198464717c52\") " Oct 03 00:36:56 crc kubenswrapper[4832]: I1003 00:36:56.031232 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f50b9899-2b27-49c9-bd77-198464717c52-utilities\") pod \"f50b9899-2b27-49c9-bd77-198464717c52\" (UID: \"f50b9899-2b27-49c9-bd77-198464717c52\") " Oct 03 00:36:56 crc kubenswrapper[4832]: I1003 00:36:56.032390 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f50b9899-2b27-49c9-bd77-198464717c52-utilities" (OuterVolumeSpecName: "utilities") pod "f50b9899-2b27-49c9-bd77-198464717c52" (UID: "f50b9899-2b27-49c9-bd77-198464717c52"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 00:36:56 crc kubenswrapper[4832]: I1003 00:36:56.044513 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f50b9899-2b27-49c9-bd77-198464717c52-kube-api-access-ncknd" (OuterVolumeSpecName: "kube-api-access-ncknd") pod "f50b9899-2b27-49c9-bd77-198464717c52" (UID: "f50b9899-2b27-49c9-bd77-198464717c52"). InnerVolumeSpecName "kube-api-access-ncknd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 00:36:56 crc kubenswrapper[4832]: I1003 00:36:56.080377 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f50b9899-2b27-49c9-bd77-198464717c52-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f50b9899-2b27-49c9-bd77-198464717c52" (UID: "f50b9899-2b27-49c9-bd77-198464717c52"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 00:36:56 crc kubenswrapper[4832]: I1003 00:36:56.132491 4832 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ncknd\" (UniqueName: \"kubernetes.io/projected/f50b9899-2b27-49c9-bd77-198464717c52-kube-api-access-ncknd\") on node \"crc\" DevicePath \"\"" Oct 03 00:36:56 crc kubenswrapper[4832]: I1003 00:36:56.132562 4832 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f50b9899-2b27-49c9-bd77-198464717c52-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 00:36:56 crc kubenswrapper[4832]: I1003 00:36:56.132575 4832 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f50b9899-2b27-49c9-bd77-198464717c52-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 00:36:56 crc kubenswrapper[4832]: I1003 00:36:56.726602 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wmx4k" event={"ID":"f50b9899-2b27-49c9-bd77-198464717c52","Type":"ContainerDied","Data":"e13db079b052f9c7f1bff603487e5321fb2bb58bc0760a8e59e627e27b5ad0b5"} Oct 03 00:36:56 crc kubenswrapper[4832]: I1003 00:36:56.726656 4832 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wmx4k" Oct 03 00:36:56 crc kubenswrapper[4832]: I1003 00:36:56.726660 4832 scope.go:117] "RemoveContainer" containerID="f4f05aa9fcb42f8b2cbf2180a8c7593b82e705585518a6294457fe5aeaecfdd8" Oct 03 00:36:56 crc kubenswrapper[4832]: I1003 00:36:56.742314 4832 scope.go:117] "RemoveContainer" containerID="1576adf6c80386c0c09416666e8ae1771ba6247eaa489f5aa7aa8d5ec411e91d" Oct 03 00:36:56 crc kubenswrapper[4832]: I1003 00:36:56.753727 4832 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wmx4k"] Oct 03 00:36:56 crc kubenswrapper[4832]: I1003 00:36:56.758137 4832 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-wmx4k"] Oct 03 00:36:56 crc kubenswrapper[4832]: I1003 00:36:56.770365 4832 scope.go:117] "RemoveContainer" containerID="bf033e3a59702e0f7b9797152adc596c468f9d1c9129b8c305e037c87e5f7bf9" Oct 03 00:36:57 crc kubenswrapper[4832]: I1003 00:36:57.393638 4832 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f50b9899-2b27-49c9-bd77-198464717c52" path="/var/lib/kubelet/pods/f50b9899-2b27-49c9-bd77-198464717c52/volumes" Oct 03 00:37:16 crc kubenswrapper[4832]: I1003 00:37:16.695107 4832 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-9xcc5"] Oct 03 00:37:16 crc kubenswrapper[4832]: E1003 00:37:16.695891 4832 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f50b9899-2b27-49c9-bd77-198464717c52" containerName="extract-content" Oct 03 00:37:16 crc kubenswrapper[4832]: I1003 00:37:16.695903 4832 state_mem.go:107] "Deleted CPUSet assignment" podUID="f50b9899-2b27-49c9-bd77-198464717c52" containerName="extract-content" Oct 03 00:37:16 
crc kubenswrapper[4832]: E1003 00:37:16.695916 4832 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f50b9899-2b27-49c9-bd77-198464717c52" containerName="extract-utilities" Oct 03 00:37:16 crc kubenswrapper[4832]: I1003 00:37:16.695922 4832 state_mem.go:107] "Deleted CPUSet assignment" podUID="f50b9899-2b27-49c9-bd77-198464717c52" containerName="extract-utilities" Oct 03 00:37:16 crc kubenswrapper[4832]: E1003 00:37:16.695933 4832 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f50b9899-2b27-49c9-bd77-198464717c52" containerName="registry-server" Oct 03 00:37:16 crc kubenswrapper[4832]: I1003 00:37:16.695939 4832 state_mem.go:107] "Deleted CPUSet assignment" podUID="f50b9899-2b27-49c9-bd77-198464717c52" containerName="registry-server" Oct 03 00:37:16 crc kubenswrapper[4832]: I1003 00:37:16.696040 4832 memory_manager.go:354] "RemoveStaleState removing state" podUID="f50b9899-2b27-49c9-bd77-198464717c52" containerName="registry-server" Oct 03 00:37:16 crc kubenswrapper[4832]: I1003 00:37:16.696486 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-interconnect-68864d46cb-9xcc5" Oct 03 00:37:16 crc kubenswrapper[4832]: I1003 00:37:16.698138 4832 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-dockercfg-86mpx" Oct 03 00:37:16 crc kubenswrapper[4832]: I1003 00:37:16.698695 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"default-interconnect-sasl-config" Oct 03 00:37:16 crc kubenswrapper[4832]: I1003 00:37:16.698954 4832 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-users" Oct 03 00:37:16 crc kubenswrapper[4832]: I1003 00:37:16.699152 4832 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-openstack-credentials" Oct 03 00:37:16 crc kubenswrapper[4832]: I1003 00:37:16.699329 4832 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-inter-router-ca" Oct 03 00:37:16 crc kubenswrapper[4832]: I1003 00:37:16.699646 4832 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-inter-router-credentials" Oct 03 00:37:16 crc kubenswrapper[4832]: I1003 00:37:16.700138 4832 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-openstack-ca" Oct 03 00:37:16 crc kubenswrapper[4832]: I1003 00:37:16.705788 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-9xcc5"] Oct 03 00:37:16 crc kubenswrapper[4832]: I1003 00:37:16.811320 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/f410043a-d947-4f9e-99d2-31cffc97693f-sasl-config\") pod \"default-interconnect-68864d46cb-9xcc5\" (UID: \"f410043a-d947-4f9e-99d2-31cffc97693f\") " pod="service-telemetry/default-interconnect-68864d46cb-9xcc5" Oct 03 00:37:16 crc kubenswrapper[4832]: I1003 00:37:16.811739 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/f410043a-d947-4f9e-99d2-31cffc97693f-default-interconnect-inter-router-credentials\") pod \"default-interconnect-68864d46cb-9xcc5\" (UID: \"f410043a-d947-4f9e-99d2-31cffc97693f\") " 
pod="service-telemetry/default-interconnect-68864d46cb-9xcc5" Oct 03 00:37:16 crc kubenswrapper[4832]: I1003 00:37:16.811864 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/f410043a-d947-4f9e-99d2-31cffc97693f-default-interconnect-openstack-credentials\") pod \"default-interconnect-68864d46cb-9xcc5\" (UID: \"f410043a-d947-4f9e-99d2-31cffc97693f\") " pod="service-telemetry/default-interconnect-68864d46cb-9xcc5" Oct 03 00:37:16 crc kubenswrapper[4832]: I1003 00:37:16.811955 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbtrl\" (UniqueName: \"kubernetes.io/projected/f410043a-d947-4f9e-99d2-31cffc97693f-kube-api-access-jbtrl\") pod \"default-interconnect-68864d46cb-9xcc5\" (UID: \"f410043a-d947-4f9e-99d2-31cffc97693f\") " pod="service-telemetry/default-interconnect-68864d46cb-9xcc5" Oct 03 00:37:16 crc kubenswrapper[4832]: I1003 00:37:16.812057 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/f410043a-d947-4f9e-99d2-31cffc97693f-default-interconnect-inter-router-ca\") pod \"default-interconnect-68864d46cb-9xcc5\" (UID: \"f410043a-d947-4f9e-99d2-31cffc97693f\") " pod="service-telemetry/default-interconnect-68864d46cb-9xcc5" Oct 03 00:37:16 crc kubenswrapper[4832]: I1003 00:37:16.812142 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/f410043a-d947-4f9e-99d2-31cffc97693f-default-interconnect-openstack-ca\") pod \"default-interconnect-68864d46cb-9xcc5\" (UID: \"f410043a-d947-4f9e-99d2-31cffc97693f\") " pod="service-telemetry/default-interconnect-68864d46cb-9xcc5" Oct 03 00:37:16 crc kubenswrapper[4832]: I1003 00:37:16.812224 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/f410043a-d947-4f9e-99d2-31cffc97693f-sasl-users\") pod \"default-interconnect-68864d46cb-9xcc5\" (UID: \"f410043a-d947-4f9e-99d2-31cffc97693f\") " pod="service-telemetry/default-interconnect-68864d46cb-9xcc5" Oct 03 00:37:16 crc kubenswrapper[4832]: I1003 00:37:16.913937 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/f410043a-d947-4f9e-99d2-31cffc97693f-sasl-config\") pod \"default-interconnect-68864d46cb-9xcc5\" (UID: \"f410043a-d947-4f9e-99d2-31cffc97693f\") " pod="service-telemetry/default-interconnect-68864d46cb-9xcc5" Oct 03 00:37:16 crc kubenswrapper[4832]: I1003 00:37:16.914014 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/f410043a-d947-4f9e-99d2-31cffc97693f-default-interconnect-inter-router-credentials\") pod \"default-interconnect-68864d46cb-9xcc5\" (UID: \"f410043a-d947-4f9e-99d2-31cffc97693f\") " pod="service-telemetry/default-interconnect-68864d46cb-9xcc5" Oct 03 00:37:16 crc kubenswrapper[4832]: I1003 00:37:16.914070 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-openstack-credentials\" (UniqueName: 
\"kubernetes.io/secret/f410043a-d947-4f9e-99d2-31cffc97693f-default-interconnect-openstack-credentials\") pod \"default-interconnect-68864d46cb-9xcc5\" (UID: \"f410043a-d947-4f9e-99d2-31cffc97693f\") " pod="service-telemetry/default-interconnect-68864d46cb-9xcc5" Oct 03 00:37:16 crc kubenswrapper[4832]: I1003 00:37:16.914098 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jbtrl\" (UniqueName: \"kubernetes.io/projected/f410043a-d947-4f9e-99d2-31cffc97693f-kube-api-access-jbtrl\") pod \"default-interconnect-68864d46cb-9xcc5\" (UID: \"f410043a-d947-4f9e-99d2-31cffc97693f\") " pod="service-telemetry/default-interconnect-68864d46cb-9xcc5" Oct 03 00:37:16 crc kubenswrapper[4832]: I1003 00:37:16.914148 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/f410043a-d947-4f9e-99d2-31cffc97693f-default-interconnect-inter-router-ca\") pod \"default-interconnect-68864d46cb-9xcc5\" (UID: \"f410043a-d947-4f9e-99d2-31cffc97693f\") " pod="service-telemetry/default-interconnect-68864d46cb-9xcc5" Oct 03 00:37:16 crc kubenswrapper[4832]: I1003 00:37:16.914172 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/f410043a-d947-4f9e-99d2-31cffc97693f-default-interconnect-openstack-ca\") pod \"default-interconnect-68864d46cb-9xcc5\" (UID: \"f410043a-d947-4f9e-99d2-31cffc97693f\") " pod="service-telemetry/default-interconnect-68864d46cb-9xcc5" Oct 03 00:37:16 crc kubenswrapper[4832]: I1003 00:37:16.914199 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/f410043a-d947-4f9e-99d2-31cffc97693f-sasl-users\") pod \"default-interconnect-68864d46cb-9xcc5\" (UID: \"f410043a-d947-4f9e-99d2-31cffc97693f\") " pod="service-telemetry/default-interconnect-68864d46cb-9xcc5" Oct 03 00:37:16 crc kubenswrapper[4832]: I1003 00:37:16.915170 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/f410043a-d947-4f9e-99d2-31cffc97693f-sasl-config\") pod \"default-interconnect-68864d46cb-9xcc5\" (UID: \"f410043a-d947-4f9e-99d2-31cffc97693f\") " pod="service-telemetry/default-interconnect-68864d46cb-9xcc5" Oct 03 00:37:16 crc kubenswrapper[4832]: I1003 00:37:16.921126 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/f410043a-d947-4f9e-99d2-31cffc97693f-default-interconnect-openstack-ca\") pod \"default-interconnect-68864d46cb-9xcc5\" (UID: \"f410043a-d947-4f9e-99d2-31cffc97693f\") " pod="service-telemetry/default-interconnect-68864d46cb-9xcc5" Oct 03 00:37:16 crc kubenswrapper[4832]: I1003 00:37:16.922098 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/f410043a-d947-4f9e-99d2-31cffc97693f-default-interconnect-inter-router-ca\") pod \"default-interconnect-68864d46cb-9xcc5\" (UID: \"f410043a-d947-4f9e-99d2-31cffc97693f\") " pod="service-telemetry/default-interconnect-68864d46cb-9xcc5" Oct 03 00:37:16 crc kubenswrapper[4832]: I1003 00:37:16.923347 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-inter-router-credentials\" (UniqueName: 
\"kubernetes.io/secret/f410043a-d947-4f9e-99d2-31cffc97693f-default-interconnect-inter-router-credentials\") pod \"default-interconnect-68864d46cb-9xcc5\" (UID: \"f410043a-d947-4f9e-99d2-31cffc97693f\") " pod="service-telemetry/default-interconnect-68864d46cb-9xcc5" Oct 03 00:37:16 crc kubenswrapper[4832]: I1003 00:37:16.924220 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/f410043a-d947-4f9e-99d2-31cffc97693f-sasl-users\") pod \"default-interconnect-68864d46cb-9xcc5\" (UID: \"f410043a-d947-4f9e-99d2-31cffc97693f\") " pod="service-telemetry/default-interconnect-68864d46cb-9xcc5" Oct 03 00:37:16 crc kubenswrapper[4832]: I1003 00:37:16.934945 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbtrl\" (UniqueName: \"kubernetes.io/projected/f410043a-d947-4f9e-99d2-31cffc97693f-kube-api-access-jbtrl\") pod \"default-interconnect-68864d46cb-9xcc5\" (UID: \"f410043a-d947-4f9e-99d2-31cffc97693f\") " pod="service-telemetry/default-interconnect-68864d46cb-9xcc5" Oct 03 00:37:16 crc kubenswrapper[4832]: I1003 00:37:16.939769 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/f410043a-d947-4f9e-99d2-31cffc97693f-default-interconnect-openstack-credentials\") pod \"default-interconnect-68864d46cb-9xcc5\" (UID: \"f410043a-d947-4f9e-99d2-31cffc97693f\") " pod="service-telemetry/default-interconnect-68864d46cb-9xcc5" Oct 03 00:37:17 crc kubenswrapper[4832]: I1003 00:37:17.014154 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-interconnect-68864d46cb-9xcc5" Oct 03 00:37:17 crc kubenswrapper[4832]: I1003 00:37:17.216639 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-9xcc5"] Oct 03 00:37:17 crc kubenswrapper[4832]: I1003 00:37:17.857058 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-interconnect-68864d46cb-9xcc5" event={"ID":"f410043a-d947-4f9e-99d2-31cffc97693f","Type":"ContainerStarted","Data":"cfb1166ca5c6990f2c0d92bc34201642480bcd3e4fdf35c6282e0b582e36abdf"} Oct 03 00:37:22 crc kubenswrapper[4832]: I1003 00:37:22.894792 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-interconnect-68864d46cb-9xcc5" event={"ID":"f410043a-d947-4f9e-99d2-31cffc97693f","Type":"ContainerStarted","Data":"ef7f077487ccc3693384df24c7474de5cc7d881136911ca45b63d6c0c1c32272"} Oct 03 00:37:26 crc kubenswrapper[4832]: I1003 00:37:26.296718 4832 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-interconnect-68864d46cb-9xcc5" podStartSLOduration=5.652910988 podStartE2EDuration="10.296693536s" podCreationTimestamp="2025-10-03 00:37:16 +0000 UTC" firstStartedPulling="2025-10-03 00:37:17.219728654 +0000 UTC m=+1830.481883760" lastFinishedPulling="2025-10-03 00:37:21.863511202 +0000 UTC m=+1835.125666308" observedRunningTime="2025-10-03 00:37:22.916521348 +0000 UTC m=+1836.178676464" watchObservedRunningTime="2025-10-03 00:37:26.296693536 +0000 UTC m=+1839.558848642" Oct 03 00:37:26 crc kubenswrapper[4832]: I1003 00:37:26.297811 4832 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/prometheus-default-0"] Oct 03 00:37:26 crc kubenswrapper[4832]: I1003 00:37:26.299658 4832 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/prometheus-default-0" Oct 03 00:37:26 crc kubenswrapper[4832]: I1003 00:37:26.302047 4832 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"prometheus-default-web-config" Oct 03 00:37:26 crc kubenswrapper[4832]: I1003 00:37:26.302269 4832 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"prometheus-default" Oct 03 00:37:26 crc kubenswrapper[4832]: I1003 00:37:26.302662 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"serving-certs-ca-bundle" Oct 03 00:37:26 crc kubenswrapper[4832]: I1003 00:37:26.303397 4832 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"prometheus-stf-dockercfg-7t4md" Oct 03 00:37:26 crc kubenswrapper[4832]: I1003 00:37:26.303490 4832 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-prometheus-proxy-tls" Oct 03 00:37:26 crc kubenswrapper[4832]: I1003 00:37:26.303395 4832 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-session-secret" Oct 03 00:37:26 crc kubenswrapper[4832]: I1003 00:37:26.303456 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-default-rulefiles-0" Oct 03 00:37:26 crc kubenswrapper[4832]: I1003 00:37:26.303849 4832 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"prometheus-default-tls-assets-0" Oct 03 00:37:26 crc kubenswrapper[4832]: I1003 00:37:26.318365 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/prometheus-default-0"] Oct 03 00:37:26 crc kubenswrapper[4832]: I1003 00:37:26.343146 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/0c8b2b5b-8aba-499a-8e9f-5e166ede2ac2-web-config\") pod \"prometheus-default-0\" (UID: \"0c8b2b5b-8aba-499a-8e9f-5e166ede2ac2\") " pod="service-telemetry/prometheus-default-0" Oct 03 00:37:26 crc kubenswrapper[4832]: I1003 00:37:26.343214 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jmgzk\" (UniqueName: \"kubernetes.io/projected/0c8b2b5b-8aba-499a-8e9f-5e166ede2ac2-kube-api-access-jmgzk\") pod \"prometheus-default-0\" (UID: \"0c8b2b5b-8aba-499a-8e9f-5e166ede2ac2\") " pod="service-telemetry/prometheus-default-0" Oct 03 00:37:26 crc kubenswrapper[4832]: I1003 00:37:26.343262 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/0c8b2b5b-8aba-499a-8e9f-5e166ede2ac2-tls-assets\") pod \"prometheus-default-0\" (UID: \"0c8b2b5b-8aba-499a-8e9f-5e166ede2ac2\") " pod="service-telemetry/prometheus-default-0" Oct 03 00:37:26 crc kubenswrapper[4832]: I1003 00:37:26.343288 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-default-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/0c8b2b5b-8aba-499a-8e9f-5e166ede2ac2-prometheus-default-rulefiles-0\") pod \"prometheus-default-0\" (UID: \"0c8b2b5b-8aba-499a-8e9f-5e166ede2ac2\") " pod="service-telemetry/prometheus-default-0" Oct 03 00:37:26 crc kubenswrapper[4832]: I1003 00:37:26.343314 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-default-prometheus-proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/0c8b2b5b-8aba-499a-8e9f-5e166ede2ac2-secret-default-prometheus-proxy-tls\") pod \"prometheus-default-0\" (UID: \"0c8b2b5b-8aba-499a-8e9f-5e166ede2ac2\") " pod="service-telemetry/prometheus-default-0" Oct 03 00:37:26 crc kubenswrapper[4832]: I1003 00:37:26.343381 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/0c8b2b5b-8aba-499a-8e9f-5e166ede2ac2-config-out\") pod \"prometheus-default-0\" (UID: \"0c8b2b5b-8aba-499a-8e9f-5e166ede2ac2\") " pod="service-telemetry/prometheus-default-0" Oct 03 00:37:26 crc kubenswrapper[4832]: I1003 00:37:26.343414 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0c8b2b5b-8aba-499a-8e9f-5e166ede2ac2-config\") pod \"prometheus-default-0\" (UID: \"0c8b2b5b-8aba-499a-8e9f-5e166ede2ac2\") " pod="service-telemetry/prometheus-default-0" Oct 03 00:37:26 crc kubenswrapper[4832]: I1003 00:37:26.343469 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-default-session-secret\" (UniqueName: \"kubernetes.io/secret/0c8b2b5b-8aba-499a-8e9f-5e166ede2ac2-secret-default-session-secret\") pod \"prometheus-default-0\" (UID: \"0c8b2b5b-8aba-499a-8e9f-5e166ede2ac2\") " pod="service-telemetry/prometheus-default-0" Oct 03 00:37:26 crc kubenswrapper[4832]: I1003 00:37:26.343503 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-480211fa-796e-42c3-b6d4-aa6d910a0c68\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-480211fa-796e-42c3-b6d4-aa6d910a0c68\") pod \"prometheus-default-0\" (UID: \"0c8b2b5b-8aba-499a-8e9f-5e166ede2ac2\") " pod="service-telemetry/prometheus-default-0" Oct 03 00:37:26 crc kubenswrapper[4832]: I1003 00:37:26.343530 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0c8b2b5b-8aba-499a-8e9f-5e166ede2ac2-configmap-serving-certs-ca-bundle\") pod \"prometheus-default-0\" (UID: \"0c8b2b5b-8aba-499a-8e9f-5e166ede2ac2\") " pod="service-telemetry/prometheus-default-0" Oct 03 00:37:26 crc kubenswrapper[4832]: I1003 00:37:26.444474 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/0c8b2b5b-8aba-499a-8e9f-5e166ede2ac2-config-out\") pod \"prometheus-default-0\" (UID: \"0c8b2b5b-8aba-499a-8e9f-5e166ede2ac2\") " pod="service-telemetry/prometheus-default-0" Oct 03 00:37:26 crc kubenswrapper[4832]: I1003 00:37:26.445391 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0c8b2b5b-8aba-499a-8e9f-5e166ede2ac2-config\") pod \"prometheus-default-0\" (UID: \"0c8b2b5b-8aba-499a-8e9f-5e166ede2ac2\") " pod="service-telemetry/prometheus-default-0" Oct 03 00:37:26 crc kubenswrapper[4832]: I1003 00:37:26.445508 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-session-secret\" (UniqueName: \"kubernetes.io/secret/0c8b2b5b-8aba-499a-8e9f-5e166ede2ac2-secret-default-session-secret\") pod \"prometheus-default-0\" (UID: \"0c8b2b5b-8aba-499a-8e9f-5e166ede2ac2\") " pod="service-telemetry/prometheus-default-0" Oct 03 00:37:26 crc kubenswrapper[4832]: I1003 00:37:26.445636 4832 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pvc-480211fa-796e-42c3-b6d4-aa6d910a0c68\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-480211fa-796e-42c3-b6d4-aa6d910a0c68\") pod \"prometheus-default-0\" (UID: \"0c8b2b5b-8aba-499a-8e9f-5e166ede2ac2\") " pod="service-telemetry/prometheus-default-0" Oct 03 00:37:26 crc kubenswrapper[4832]: I1003 00:37:26.445740 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0c8b2b5b-8aba-499a-8e9f-5e166ede2ac2-configmap-serving-certs-ca-bundle\") pod \"prometheus-default-0\" (UID: \"0c8b2b5b-8aba-499a-8e9f-5e166ede2ac2\") " pod="service-telemetry/prometheus-default-0" Oct 03 00:37:26 crc kubenswrapper[4832]: I1003 00:37:26.445856 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/0c8b2b5b-8aba-499a-8e9f-5e166ede2ac2-web-config\") pod \"prometheus-default-0\" (UID: \"0c8b2b5b-8aba-499a-8e9f-5e166ede2ac2\") " pod="service-telemetry/prometheus-default-0" Oct 03 00:37:26 crc kubenswrapper[4832]: I1003 00:37:26.445967 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jmgzk\" (UniqueName: \"kubernetes.io/projected/0c8b2b5b-8aba-499a-8e9f-5e166ede2ac2-kube-api-access-jmgzk\") pod \"prometheus-default-0\" (UID: \"0c8b2b5b-8aba-499a-8e9f-5e166ede2ac2\") " pod="service-telemetry/prometheus-default-0" Oct 03 00:37:26 crc kubenswrapper[4832]: I1003 00:37:26.446068 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/0c8b2b5b-8aba-499a-8e9f-5e166ede2ac2-tls-assets\") pod \"prometheus-default-0\" (UID: \"0c8b2b5b-8aba-499a-8e9f-5e166ede2ac2\") " pod="service-telemetry/prometheus-default-0" Oct 03 00:37:26 crc kubenswrapper[4832]: I1003 00:37:26.446152 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-default-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/0c8b2b5b-8aba-499a-8e9f-5e166ede2ac2-prometheus-default-rulefiles-0\") pod \"prometheus-default-0\" (UID: \"0c8b2b5b-8aba-499a-8e9f-5e166ede2ac2\") " pod="service-telemetry/prometheus-default-0" Oct 03 00:37:26 crc kubenswrapper[4832]: I1003 00:37:26.446236 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-prometheus-proxy-tls\" (UniqueName: \"kubernetes.io/secret/0c8b2b5b-8aba-499a-8e9f-5e166ede2ac2-secret-default-prometheus-proxy-tls\") pod \"prometheus-default-0\" (UID: \"0c8b2b5b-8aba-499a-8e9f-5e166ede2ac2\") " pod="service-telemetry/prometheus-default-0" Oct 03 00:37:26 crc kubenswrapper[4832]: E1003 00:37:26.446445 4832 secret.go:188] Couldn't get secret service-telemetry/default-prometheus-proxy-tls: secret "default-prometheus-proxy-tls" not found Oct 03 00:37:26 crc kubenswrapper[4832]: E1003 00:37:26.446529 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0c8b2b5b-8aba-499a-8e9f-5e166ede2ac2-secret-default-prometheus-proxy-tls podName:0c8b2b5b-8aba-499a-8e9f-5e166ede2ac2 nodeName:}" failed. No retries permitted until 2025-10-03 00:37:26.946504785 +0000 UTC m=+1840.208659891 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "secret-default-prometheus-proxy-tls" (UniqueName: "kubernetes.io/secret/0c8b2b5b-8aba-499a-8e9f-5e166ede2ac2-secret-default-prometheus-proxy-tls") pod "prometheus-default-0" (UID: "0c8b2b5b-8aba-499a-8e9f-5e166ede2ac2") : secret "default-prometheus-proxy-tls" not found Oct 03 00:37:26 crc kubenswrapper[4832]: I1003 00:37:26.447466 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-default-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/0c8b2b5b-8aba-499a-8e9f-5e166ede2ac2-prometheus-default-rulefiles-0\") pod \"prometheus-default-0\" (UID: \"0c8b2b5b-8aba-499a-8e9f-5e166ede2ac2\") " pod="service-telemetry/prometheus-default-0" Oct 03 00:37:26 crc kubenswrapper[4832]: I1003 00:37:26.447528 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0c8b2b5b-8aba-499a-8e9f-5e166ede2ac2-configmap-serving-certs-ca-bundle\") pod \"prometheus-default-0\" (UID: \"0c8b2b5b-8aba-499a-8e9f-5e166ede2ac2\") " pod="service-telemetry/prometheus-default-0" Oct 03 00:37:26 crc kubenswrapper[4832]: I1003 00:37:26.450888 4832 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Oct 03 00:37:26 crc kubenswrapper[4832]: I1003 00:37:26.450969 4832 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-480211fa-796e-42c3-b6d4-aa6d910a0c68\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-480211fa-796e-42c3-b6d4-aa6d910a0c68\") pod \"prometheus-default-0\" (UID: \"0c8b2b5b-8aba-499a-8e9f-5e166ede2ac2\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/d24da641b258589e0a6e4dd59f02e4c65649c4a1650522810185b8c5e63f920f/globalmount\"" pod="service-telemetry/prometheus-default-0" Oct 03 00:37:26 crc kubenswrapper[4832]: I1003 00:37:26.452420 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-default-session-secret\" (UniqueName: \"kubernetes.io/secret/0c8b2b5b-8aba-499a-8e9f-5e166ede2ac2-secret-default-session-secret\") pod \"prometheus-default-0\" (UID: \"0c8b2b5b-8aba-499a-8e9f-5e166ede2ac2\") " pod="service-telemetry/prometheus-default-0" Oct 03 00:37:26 crc kubenswrapper[4832]: I1003 00:37:26.452624 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/0c8b2b5b-8aba-499a-8e9f-5e166ede2ac2-config\") pod \"prometheus-default-0\" (UID: \"0c8b2b5b-8aba-499a-8e9f-5e166ede2ac2\") " pod="service-telemetry/prometheus-default-0" Oct 03 00:37:26 crc kubenswrapper[4832]: I1003 00:37:26.454398 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/0c8b2b5b-8aba-499a-8e9f-5e166ede2ac2-web-config\") pod \"prometheus-default-0\" (UID: \"0c8b2b5b-8aba-499a-8e9f-5e166ede2ac2\") " pod="service-telemetry/prometheus-default-0" Oct 03 00:37:26 crc kubenswrapper[4832]: I1003 00:37:26.455382 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/0c8b2b5b-8aba-499a-8e9f-5e166ede2ac2-tls-assets\") pod \"prometheus-default-0\" (UID: \"0c8b2b5b-8aba-499a-8e9f-5e166ede2ac2\") " pod="service-telemetry/prometheus-default-0" Oct 03 00:37:26 crc kubenswrapper[4832]: I1003 00:37:26.462814 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" 
(UniqueName: \"kubernetes.io/empty-dir/0c8b2b5b-8aba-499a-8e9f-5e166ede2ac2-config-out\") pod \"prometheus-default-0\" (UID: \"0c8b2b5b-8aba-499a-8e9f-5e166ede2ac2\") " pod="service-telemetry/prometheus-default-0" Oct 03 00:37:26 crc kubenswrapper[4832]: I1003 00:37:26.465264 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jmgzk\" (UniqueName: \"kubernetes.io/projected/0c8b2b5b-8aba-499a-8e9f-5e166ede2ac2-kube-api-access-jmgzk\") pod \"prometheus-default-0\" (UID: \"0c8b2b5b-8aba-499a-8e9f-5e166ede2ac2\") " pod="service-telemetry/prometheus-default-0" Oct 03 00:37:26 crc kubenswrapper[4832]: I1003 00:37:26.497853 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-480211fa-796e-42c3-b6d4-aa6d910a0c68\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-480211fa-796e-42c3-b6d4-aa6d910a0c68\") pod \"prometheus-default-0\" (UID: \"0c8b2b5b-8aba-499a-8e9f-5e166ede2ac2\") " pod="service-telemetry/prometheus-default-0" Oct 03 00:37:26 crc kubenswrapper[4832]: I1003 00:37:26.952030 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-prometheus-proxy-tls\" (UniqueName: \"kubernetes.io/secret/0c8b2b5b-8aba-499a-8e9f-5e166ede2ac2-secret-default-prometheus-proxy-tls\") pod \"prometheus-default-0\" (UID: \"0c8b2b5b-8aba-499a-8e9f-5e166ede2ac2\") " pod="service-telemetry/prometheus-default-0" Oct 03 00:37:26 crc kubenswrapper[4832]: E1003 00:37:26.952196 4832 secret.go:188] Couldn't get secret service-telemetry/default-prometheus-proxy-tls: secret "default-prometheus-proxy-tls" not found Oct 03 00:37:26 crc kubenswrapper[4832]: E1003 00:37:26.952251 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0c8b2b5b-8aba-499a-8e9f-5e166ede2ac2-secret-default-prometheus-proxy-tls podName:0c8b2b5b-8aba-499a-8e9f-5e166ede2ac2 nodeName:}" failed. No retries permitted until 2025-10-03 00:37:27.952237267 +0000 UTC m=+1841.214392373 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "secret-default-prometheus-proxy-tls" (UniqueName: "kubernetes.io/secret/0c8b2b5b-8aba-499a-8e9f-5e166ede2ac2-secret-default-prometheus-proxy-tls") pod "prometheus-default-0" (UID: "0c8b2b5b-8aba-499a-8e9f-5e166ede2ac2") : secret "default-prometheus-proxy-tls" not found Oct 03 00:37:27 crc kubenswrapper[4832]: I1003 00:37:27.964007 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-prometheus-proxy-tls\" (UniqueName: \"kubernetes.io/secret/0c8b2b5b-8aba-499a-8e9f-5e166ede2ac2-secret-default-prometheus-proxy-tls\") pod \"prometheus-default-0\" (UID: \"0c8b2b5b-8aba-499a-8e9f-5e166ede2ac2\") " pod="service-telemetry/prometheus-default-0" Oct 03 00:37:27 crc kubenswrapper[4832]: I1003 00:37:27.970399 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-default-prometheus-proxy-tls\" (UniqueName: \"kubernetes.io/secret/0c8b2b5b-8aba-499a-8e9f-5e166ede2ac2-secret-default-prometheus-proxy-tls\") pod \"prometheus-default-0\" (UID: \"0c8b2b5b-8aba-499a-8e9f-5e166ede2ac2\") " pod="service-telemetry/prometheus-default-0" Oct 03 00:37:28 crc kubenswrapper[4832]: I1003 00:37:28.121396 4832 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/prometheus-default-0" Oct 03 00:37:28 crc kubenswrapper[4832]: I1003 00:37:28.531472 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/prometheus-default-0"] Oct 03 00:37:28 crc kubenswrapper[4832]: I1003 00:37:28.954463 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-default-0" event={"ID":"0c8b2b5b-8aba-499a-8e9f-5e166ede2ac2","Type":"ContainerStarted","Data":"63f176b52fe1fd1e5b87cd9b40e7be3a2adde0526178136983272859f9b52e39"} Oct 03 00:37:32 crc kubenswrapper[4832]: I1003 00:37:32.984264 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-default-0" event={"ID":"0c8b2b5b-8aba-499a-8e9f-5e166ede2ac2","Type":"ContainerStarted","Data":"f88e3eec532d48e3c6765bdd152b481515996ab4725a88ac2876a6769ff91bf3"} Oct 03 00:37:36 crc kubenswrapper[4832]: I1003 00:37:36.100840 4832 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-snmp-webhook-6856cfb745-8xhrr"] Oct 03 00:37:36 crc kubenswrapper[4832]: I1003 00:37:36.101991 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-snmp-webhook-6856cfb745-8xhrr" Oct 03 00:37:36 crc kubenswrapper[4832]: I1003 00:37:36.114372 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-snmp-webhook-6856cfb745-8xhrr"] Oct 03 00:37:36 crc kubenswrapper[4832]: I1003 00:37:36.286676 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-spvd7\" (UniqueName: \"kubernetes.io/projected/1fcde8b7-66a3-468b-a67d-784b8b5f8e21-kube-api-access-spvd7\") pod \"default-snmp-webhook-6856cfb745-8xhrr\" (UID: \"1fcde8b7-66a3-468b-a67d-784b8b5f8e21\") " pod="service-telemetry/default-snmp-webhook-6856cfb745-8xhrr" Oct 03 00:37:36 crc kubenswrapper[4832]: I1003 00:37:36.388267 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-spvd7\" (UniqueName: \"kubernetes.io/projected/1fcde8b7-66a3-468b-a67d-784b8b5f8e21-kube-api-access-spvd7\") pod \"default-snmp-webhook-6856cfb745-8xhrr\" (UID: \"1fcde8b7-66a3-468b-a67d-784b8b5f8e21\") " pod="service-telemetry/default-snmp-webhook-6856cfb745-8xhrr" Oct 03 00:37:36 crc kubenswrapper[4832]: I1003 00:37:36.421341 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-spvd7\" (UniqueName: \"kubernetes.io/projected/1fcde8b7-66a3-468b-a67d-784b8b5f8e21-kube-api-access-spvd7\") pod \"default-snmp-webhook-6856cfb745-8xhrr\" (UID: \"1fcde8b7-66a3-468b-a67d-784b8b5f8e21\") " pod="service-telemetry/default-snmp-webhook-6856cfb745-8xhrr" Oct 03 00:37:36 crc kubenswrapper[4832]: I1003 00:37:36.431683 4832 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-snmp-webhook-6856cfb745-8xhrr" Oct 03 00:37:36 crc kubenswrapper[4832]: I1003 00:37:36.696909 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-snmp-webhook-6856cfb745-8xhrr"] Oct 03 00:37:36 crc kubenswrapper[4832]: W1003 00:37:36.703724 4832 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1fcde8b7_66a3_468b_a67d_784b8b5f8e21.slice/crio-aa92d2450e14814557a448aee14ee4297907e793bfabf9339900c496091891e3 WatchSource:0}: Error finding container aa92d2450e14814557a448aee14ee4297907e793bfabf9339900c496091891e3: Status 404 returned error can't find the container with id aa92d2450e14814557a448aee14ee4297907e793bfabf9339900c496091891e3 Oct 03 00:37:37 crc kubenswrapper[4832]: I1003 00:37:37.016686 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-snmp-webhook-6856cfb745-8xhrr" event={"ID":"1fcde8b7-66a3-468b-a67d-784b8b5f8e21","Type":"ContainerStarted","Data":"aa92d2450e14814557a448aee14ee4297907e793bfabf9339900c496091891e3"} Oct 03 00:37:39 crc kubenswrapper[4832]: I1003 00:37:39.711473 4832 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/alertmanager-default-0"] Oct 03 00:37:39 crc kubenswrapper[4832]: I1003 00:37:39.713061 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/alertmanager-default-0" Oct 03 00:37:39 crc kubenswrapper[4832]: I1003 00:37:39.715283 4832 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"alertmanager-default-generated" Oct 03 00:37:39 crc kubenswrapper[4832]: I1003 00:37:39.715551 4832 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"alertmanager-default-web-config" Oct 03 00:37:39 crc kubenswrapper[4832]: I1003 00:37:39.715603 4832 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"alertmanager-stf-dockercfg-hkl64" Oct 03 00:37:39 crc kubenswrapper[4832]: I1003 00:37:39.715557 4832 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-alertmanager-proxy-tls" Oct 03 00:37:39 crc kubenswrapper[4832]: I1003 00:37:39.716156 4832 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"alertmanager-default-tls-assets-0" Oct 03 00:37:39 crc kubenswrapper[4832]: I1003 00:37:39.727225 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/alertmanager-default-0"] Oct 03 00:37:39 crc kubenswrapper[4832]: I1003 00:37:39.843160 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/7a756837-fae9-4644-9fed-56fe7576ea4c-web-config\") pod \"alertmanager-default-0\" (UID: \"7a756837-fae9-4644-9fed-56fe7576ea4c\") " pod="service-telemetry/alertmanager-default-0" Oct 03 00:37:39 crc kubenswrapper[4832]: I1003 00:37:39.843236 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/7a756837-fae9-4644-9fed-56fe7576ea4c-config-out\") pod \"alertmanager-default-0\" (UID: \"7a756837-fae9-4644-9fed-56fe7576ea4c\") " pod="service-telemetry/alertmanager-default-0" Oct 03 00:37:39 crc kubenswrapper[4832]: I1003 00:37:39.843270 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" 
(UniqueName: \"kubernetes.io/secret/7a756837-fae9-4644-9fed-56fe7576ea4c-config-volume\") pod \"alertmanager-default-0\" (UID: \"7a756837-fae9-4644-9fed-56fe7576ea4c\") " pod="service-telemetry/alertmanager-default-0" Oct 03 00:37:39 crc kubenswrapper[4832]: I1003 00:37:39.843308 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-db04db1e-61c3-4d28-a471-f65082884a9c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-db04db1e-61c3-4d28-a471-f65082884a9c\") pod \"alertmanager-default-0\" (UID: \"7a756837-fae9-4644-9fed-56fe7576ea4c\") " pod="service-telemetry/alertmanager-default-0" Oct 03 00:37:39 crc kubenswrapper[4832]: I1003 00:37:39.843332 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/7a756837-fae9-4644-9fed-56fe7576ea4c-tls-assets\") pod \"alertmanager-default-0\" (UID: \"7a756837-fae9-4644-9fed-56fe7576ea4c\") " pod="service-telemetry/alertmanager-default-0" Oct 03 00:37:39 crc kubenswrapper[4832]: I1003 00:37:39.843446 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-default-alertmanager-proxy-tls\" (UniqueName: \"kubernetes.io/secret/7a756837-fae9-4644-9fed-56fe7576ea4c-secret-default-alertmanager-proxy-tls\") pod \"alertmanager-default-0\" (UID: \"7a756837-fae9-4644-9fed-56fe7576ea4c\") " pod="service-telemetry/alertmanager-default-0" Oct 03 00:37:39 crc kubenswrapper[4832]: I1003 00:37:39.843522 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dmwds\" (UniqueName: \"kubernetes.io/projected/7a756837-fae9-4644-9fed-56fe7576ea4c-kube-api-access-dmwds\") pod \"alertmanager-default-0\" (UID: \"7a756837-fae9-4644-9fed-56fe7576ea4c\") " pod="service-telemetry/alertmanager-default-0" Oct 03 00:37:39 crc kubenswrapper[4832]: I1003 00:37:39.843565 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-default-session-secret\" (UniqueName: \"kubernetes.io/secret/7a756837-fae9-4644-9fed-56fe7576ea4c-secret-default-session-secret\") pod \"alertmanager-default-0\" (UID: \"7a756837-fae9-4644-9fed-56fe7576ea4c\") " pod="service-telemetry/alertmanager-default-0" Oct 03 00:37:39 crc kubenswrapper[4832]: I1003 00:37:39.944946 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-db04db1e-61c3-4d28-a471-f65082884a9c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-db04db1e-61c3-4d28-a471-f65082884a9c\") pod \"alertmanager-default-0\" (UID: \"7a756837-fae9-4644-9fed-56fe7576ea4c\") " pod="service-telemetry/alertmanager-default-0" Oct 03 00:37:39 crc kubenswrapper[4832]: I1003 00:37:39.944987 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/7a756837-fae9-4644-9fed-56fe7576ea4c-tls-assets\") pod \"alertmanager-default-0\" (UID: \"7a756837-fae9-4644-9fed-56fe7576ea4c\") " pod="service-telemetry/alertmanager-default-0" Oct 03 00:37:39 crc kubenswrapper[4832]: I1003 00:37:39.945037 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-alertmanager-proxy-tls\" (UniqueName: \"kubernetes.io/secret/7a756837-fae9-4644-9fed-56fe7576ea4c-secret-default-alertmanager-proxy-tls\") pod \"alertmanager-default-0\" (UID: \"7a756837-fae9-4644-9fed-56fe7576ea4c\") " 
pod="service-telemetry/alertmanager-default-0" Oct 03 00:37:39 crc kubenswrapper[4832]: I1003 00:37:39.945109 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dmwds\" (UniqueName: \"kubernetes.io/projected/7a756837-fae9-4644-9fed-56fe7576ea4c-kube-api-access-dmwds\") pod \"alertmanager-default-0\" (UID: \"7a756837-fae9-4644-9fed-56fe7576ea4c\") " pod="service-telemetry/alertmanager-default-0" Oct 03 00:37:39 crc kubenswrapper[4832]: I1003 00:37:39.945143 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-session-secret\" (UniqueName: \"kubernetes.io/secret/7a756837-fae9-4644-9fed-56fe7576ea4c-secret-default-session-secret\") pod \"alertmanager-default-0\" (UID: \"7a756837-fae9-4644-9fed-56fe7576ea4c\") " pod="service-telemetry/alertmanager-default-0" Oct 03 00:37:39 crc kubenswrapper[4832]: I1003 00:37:39.945172 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/7a756837-fae9-4644-9fed-56fe7576ea4c-web-config\") pod \"alertmanager-default-0\" (UID: \"7a756837-fae9-4644-9fed-56fe7576ea4c\") " pod="service-telemetry/alertmanager-default-0" Oct 03 00:37:39 crc kubenswrapper[4832]: I1003 00:37:39.945203 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/7a756837-fae9-4644-9fed-56fe7576ea4c-config-out\") pod \"alertmanager-default-0\" (UID: \"7a756837-fae9-4644-9fed-56fe7576ea4c\") " pod="service-telemetry/alertmanager-default-0" Oct 03 00:37:39 crc kubenswrapper[4832]: I1003 00:37:39.945226 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/7a756837-fae9-4644-9fed-56fe7576ea4c-config-volume\") pod \"alertmanager-default-0\" (UID: \"7a756837-fae9-4644-9fed-56fe7576ea4c\") " pod="service-telemetry/alertmanager-default-0" Oct 03 00:37:39 crc kubenswrapper[4832]: E1003 00:37:39.948111 4832 secret.go:188] Couldn't get secret service-telemetry/default-alertmanager-proxy-tls: secret "default-alertmanager-proxy-tls" not found Oct 03 00:37:39 crc kubenswrapper[4832]: E1003 00:37:39.948172 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7a756837-fae9-4644-9fed-56fe7576ea4c-secret-default-alertmanager-proxy-tls podName:7a756837-fae9-4644-9fed-56fe7576ea4c nodeName:}" failed. No retries permitted until 2025-10-03 00:37:40.448154781 +0000 UTC m=+1853.710309887 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "secret-default-alertmanager-proxy-tls" (UniqueName: "kubernetes.io/secret/7a756837-fae9-4644-9fed-56fe7576ea4c-secret-default-alertmanager-proxy-tls") pod "alertmanager-default-0" (UID: "7a756837-fae9-4644-9fed-56fe7576ea4c") : secret "default-alertmanager-proxy-tls" not found Oct 03 00:37:39 crc kubenswrapper[4832]: I1003 00:37:39.951560 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/7a756837-fae9-4644-9fed-56fe7576ea4c-tls-assets\") pod \"alertmanager-default-0\" (UID: \"7a756837-fae9-4644-9fed-56fe7576ea4c\") " pod="service-telemetry/alertmanager-default-0" Oct 03 00:37:39 crc kubenswrapper[4832]: I1003 00:37:39.951741 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/7a756837-fae9-4644-9fed-56fe7576ea4c-web-config\") pod \"alertmanager-default-0\" (UID: \"7a756837-fae9-4644-9fed-56fe7576ea4c\") " pod="service-telemetry/alertmanager-default-0" Oct 03 00:37:39 crc kubenswrapper[4832]: I1003 00:37:39.952447 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/7a756837-fae9-4644-9fed-56fe7576ea4c-config-out\") pod \"alertmanager-default-0\" (UID: \"7a756837-fae9-4644-9fed-56fe7576ea4c\") " pod="service-telemetry/alertmanager-default-0" Oct 03 00:37:39 crc kubenswrapper[4832]: I1003 00:37:39.953031 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/7a756837-fae9-4644-9fed-56fe7576ea4c-config-volume\") pod \"alertmanager-default-0\" (UID: \"7a756837-fae9-4644-9fed-56fe7576ea4c\") " pod="service-telemetry/alertmanager-default-0" Oct 03 00:37:39 crc kubenswrapper[4832]: I1003 00:37:39.957787 4832 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Oct 03 00:37:39 crc kubenswrapper[4832]: I1003 00:37:39.957821 4832 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-db04db1e-61c3-4d28-a471-f65082884a9c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-db04db1e-61c3-4d28-a471-f65082884a9c\") pod \"alertmanager-default-0\" (UID: \"7a756837-fae9-4644-9fed-56fe7576ea4c\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/00cc1fb8310ad96a404829e9c6c0cf0a279dc1d5d6960e7a2a6a2fed99b8a748/globalmount\"" pod="service-telemetry/alertmanager-default-0" Oct 03 00:37:39 crc kubenswrapper[4832]: I1003 00:37:39.962105 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dmwds\" (UniqueName: \"kubernetes.io/projected/7a756837-fae9-4644-9fed-56fe7576ea4c-kube-api-access-dmwds\") pod \"alertmanager-default-0\" (UID: \"7a756837-fae9-4644-9fed-56fe7576ea4c\") " pod="service-telemetry/alertmanager-default-0" Oct 03 00:37:39 crc kubenswrapper[4832]: I1003 00:37:39.968314 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-default-session-secret\" (UniqueName: \"kubernetes.io/secret/7a756837-fae9-4644-9fed-56fe7576ea4c-secret-default-session-secret\") pod \"alertmanager-default-0\" (UID: \"7a756837-fae9-4644-9fed-56fe7576ea4c\") " pod="service-telemetry/alertmanager-default-0" Oct 03 00:37:39 crc kubenswrapper[4832]: I1003 00:37:39.988327 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-db04db1e-61c3-4d28-a471-f65082884a9c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-db04db1e-61c3-4d28-a471-f65082884a9c\") pod \"alertmanager-default-0\" (UID: \"7a756837-fae9-4644-9fed-56fe7576ea4c\") " pod="service-telemetry/alertmanager-default-0" Oct 03 00:37:40 crc kubenswrapper[4832]: I1003 00:37:40.036216 4832 generic.go:334] "Generic (PLEG): container finished" podID="0c8b2b5b-8aba-499a-8e9f-5e166ede2ac2" containerID="f88e3eec532d48e3c6765bdd152b481515996ab4725a88ac2876a6769ff91bf3" exitCode=0 Oct 03 00:37:40 crc kubenswrapper[4832]: I1003 00:37:40.036255 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-default-0" event={"ID":"0c8b2b5b-8aba-499a-8e9f-5e166ede2ac2","Type":"ContainerDied","Data":"f88e3eec532d48e3c6765bdd152b481515996ab4725a88ac2876a6769ff91bf3"} Oct 03 00:37:40 crc kubenswrapper[4832]: I1003 00:37:40.454177 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-alertmanager-proxy-tls\" (UniqueName: \"kubernetes.io/secret/7a756837-fae9-4644-9fed-56fe7576ea4c-secret-default-alertmanager-proxy-tls\") pod \"alertmanager-default-0\" (UID: \"7a756837-fae9-4644-9fed-56fe7576ea4c\") " pod="service-telemetry/alertmanager-default-0" Oct 03 00:37:40 crc kubenswrapper[4832]: E1003 00:37:40.454388 4832 secret.go:188] Couldn't get secret service-telemetry/default-alertmanager-proxy-tls: secret "default-alertmanager-proxy-tls" not found Oct 03 00:37:40 crc kubenswrapper[4832]: E1003 00:37:40.454694 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7a756837-fae9-4644-9fed-56fe7576ea4c-secret-default-alertmanager-proxy-tls podName:7a756837-fae9-4644-9fed-56fe7576ea4c nodeName:}" failed. No retries permitted until 2025-10-03 00:37:41.454664017 +0000 UTC m=+1854.716819123 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "secret-default-alertmanager-proxy-tls" (UniqueName: "kubernetes.io/secret/7a756837-fae9-4644-9fed-56fe7576ea4c-secret-default-alertmanager-proxy-tls") pod "alertmanager-default-0" (UID: "7a756837-fae9-4644-9fed-56fe7576ea4c") : secret "default-alertmanager-proxy-tls" not found Oct 03 00:37:41 crc kubenswrapper[4832]: I1003 00:37:41.468243 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-alertmanager-proxy-tls\" (UniqueName: \"kubernetes.io/secret/7a756837-fae9-4644-9fed-56fe7576ea4c-secret-default-alertmanager-proxy-tls\") pod \"alertmanager-default-0\" (UID: \"7a756837-fae9-4644-9fed-56fe7576ea4c\") " pod="service-telemetry/alertmanager-default-0" Oct 03 00:37:41 crc kubenswrapper[4832]: E1003 00:37:41.468443 4832 secret.go:188] Couldn't get secret service-telemetry/default-alertmanager-proxy-tls: secret "default-alertmanager-proxy-tls" not found Oct 03 00:37:41 crc kubenswrapper[4832]: E1003 00:37:41.468521 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7a756837-fae9-4644-9fed-56fe7576ea4c-secret-default-alertmanager-proxy-tls podName:7a756837-fae9-4644-9fed-56fe7576ea4c nodeName:}" failed. No retries permitted until 2025-10-03 00:37:43.468501245 +0000 UTC m=+1856.730656351 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "secret-default-alertmanager-proxy-tls" (UniqueName: "kubernetes.io/secret/7a756837-fae9-4644-9fed-56fe7576ea4c-secret-default-alertmanager-proxy-tls") pod "alertmanager-default-0" (UID: "7a756837-fae9-4644-9fed-56fe7576ea4c") : secret "default-alertmanager-proxy-tls" not found Oct 03 00:37:43 crc kubenswrapper[4832]: I1003 00:37:43.065369 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-snmp-webhook-6856cfb745-8xhrr" event={"ID":"1fcde8b7-66a3-468b-a67d-784b8b5f8e21","Type":"ContainerStarted","Data":"6fcb1545f0f6dabaad681c2dfa25afc9e0e43f5debf680428a62b06aeb5ca72a"} Oct 03 00:37:43 crc kubenswrapper[4832]: I1003 00:37:43.079873 4832 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-snmp-webhook-6856cfb745-8xhrr" podStartSLOduration=1.2776431160000001 podStartE2EDuration="7.079856147s" podCreationTimestamp="2025-10-03 00:37:36 +0000 UTC" firstStartedPulling="2025-10-03 00:37:36.706827983 +0000 UTC m=+1849.968983089" lastFinishedPulling="2025-10-03 00:37:42.509041014 +0000 UTC m=+1855.771196120" observedRunningTime="2025-10-03 00:37:43.076710757 +0000 UTC m=+1856.338865863" watchObservedRunningTime="2025-10-03 00:37:43.079856147 +0000 UTC m=+1856.342011253" Oct 03 00:37:43 crc kubenswrapper[4832]: I1003 00:37:43.497640 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-alertmanager-proxy-tls\" (UniqueName: \"kubernetes.io/secret/7a756837-fae9-4644-9fed-56fe7576ea4c-secret-default-alertmanager-proxy-tls\") pod \"alertmanager-default-0\" (UID: \"7a756837-fae9-4644-9fed-56fe7576ea4c\") " pod="service-telemetry/alertmanager-default-0" Oct 03 00:37:43 crc kubenswrapper[4832]: I1003 00:37:43.503906 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-default-alertmanager-proxy-tls\" (UniqueName: \"kubernetes.io/secret/7a756837-fae9-4644-9fed-56fe7576ea4c-secret-default-alertmanager-proxy-tls\") pod \"alertmanager-default-0\" (UID: \"7a756837-fae9-4644-9fed-56fe7576ea4c\") " pod="service-telemetry/alertmanager-default-0" Oct 03 00:37:43 crc kubenswrapper[4832]: I1003 
00:37:43.637224 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/alertmanager-default-0" Oct 03 00:37:44 crc kubenswrapper[4832]: I1003 00:37:44.033358 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/alertmanager-default-0"] Oct 03 00:37:44 crc kubenswrapper[4832]: W1003 00:37:44.042162 4832 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a756837_fae9_4644_9fed_56fe7576ea4c.slice/crio-368f3c3d88cbecce23264fe84acc2525503934cd96ed9cf9e3cf8bd03c363659 WatchSource:0}: Error finding container 368f3c3d88cbecce23264fe84acc2525503934cd96ed9cf9e3cf8bd03c363659: Status 404 returned error can't find the container with id 368f3c3d88cbecce23264fe84acc2525503934cd96ed9cf9e3cf8bd03c363659 Oct 03 00:37:44 crc kubenswrapper[4832]: I1003 00:37:44.073815 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/alertmanager-default-0" event={"ID":"7a756837-fae9-4644-9fed-56fe7576ea4c","Type":"ContainerStarted","Data":"368f3c3d88cbecce23264fe84acc2525503934cd96ed9cf9e3cf8bd03c363659"} Oct 03 00:37:46 crc kubenswrapper[4832]: I1003 00:37:46.088099 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/alertmanager-default-0" event={"ID":"7a756837-fae9-4644-9fed-56fe7576ea4c","Type":"ContainerStarted","Data":"7fe2f08baa04801db3051806c9182d61e42f304e7f94b5826f8d62a6dd38c275"} Oct 03 00:37:47 crc kubenswrapper[4832]: I1003 00:37:47.096159 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-default-0" event={"ID":"0c8b2b5b-8aba-499a-8e9f-5e166ede2ac2","Type":"ContainerStarted","Data":"5ae25cdc89b069675e09faead863b4702c90518da04499ff2cf01387f34511b1"} Oct 03 00:37:49 crc kubenswrapper[4832]: I1003 00:37:49.110232 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-default-0" event={"ID":"0c8b2b5b-8aba-499a-8e9f-5e166ede2ac2","Type":"ContainerStarted","Data":"4f945c37228b526f6696bff82c88728564d8076c883b073b7942eae3f7da6531"} Oct 03 00:37:52 crc kubenswrapper[4832]: I1003 00:37:52.140200 4832 generic.go:334] "Generic (PLEG): container finished" podID="7a756837-fae9-4644-9fed-56fe7576ea4c" containerID="7fe2f08baa04801db3051806c9182d61e42f304e7f94b5826f8d62a6dd38c275" exitCode=0 Oct 03 00:37:52 crc kubenswrapper[4832]: I1003 00:37:52.140297 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/alertmanager-default-0" event={"ID":"7a756837-fae9-4644-9fed-56fe7576ea4c","Type":"ContainerDied","Data":"7fe2f08baa04801db3051806c9182d61e42f304e7f94b5826f8d62a6dd38c275"} Oct 03 00:37:53 crc kubenswrapper[4832]: I1003 00:37:53.002498 4832 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-jvpfj"] Oct 03 00:37:53 crc kubenswrapper[4832]: I1003 00:37:53.003655 4832 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-jvpfj" Oct 03 00:37:53 crc kubenswrapper[4832]: I1003 00:37:53.006082 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"default-cloud1-coll-meter-sg-core-configmap" Oct 03 00:37:53 crc kubenswrapper[4832]: I1003 00:37:53.006323 4832 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"smart-gateway-dockercfg-jcxmt" Oct 03 00:37:53 crc kubenswrapper[4832]: I1003 00:37:53.007362 4832 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"smart-gateway-session-secret" Oct 03 00:37:53 crc kubenswrapper[4832]: I1003 00:37:53.007472 4832 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-cloud1-coll-meter-proxy-tls" Oct 03 00:37:53 crc kubenswrapper[4832]: I1003 00:37:53.010946 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-jvpfj"] Oct 03 00:37:53 crc kubenswrapper[4832]: I1003 00:37:53.122705 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/88373a98-b547-43d0-982f-69c1fd9f5507-session-secret\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-jvpfj\" (UID: \"88373a98-b547-43d0-982f-69c1fd9f5507\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-jvpfj" Oct 03 00:37:53 crc kubenswrapper[4832]: I1003 00:37:53.122774 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/88373a98-b547-43d0-982f-69c1fd9f5507-sg-core-config\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-jvpfj\" (UID: \"88373a98-b547-43d0-982f-69c1fd9f5507\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-jvpfj" Oct 03 00:37:53 crc kubenswrapper[4832]: I1003 00:37:53.122796 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/88373a98-b547-43d0-982f-69c1fd9f5507-socket-dir\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-jvpfj\" (UID: \"88373a98-b547-43d0-982f-69c1fd9f5507\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-jvpfj" Oct 03 00:37:53 crc kubenswrapper[4832]: I1003 00:37:53.122815 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-cloud1-coll-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/88373a98-b547-43d0-982f-69c1fd9f5507-default-cloud1-coll-meter-proxy-tls\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-jvpfj\" (UID: \"88373a98-b547-43d0-982f-69c1fd9f5507\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-jvpfj" Oct 03 00:37:53 crc kubenswrapper[4832]: I1003 00:37:53.122846 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k8jwf\" (UniqueName: \"kubernetes.io/projected/88373a98-b547-43d0-982f-69c1fd9f5507-kube-api-access-k8jwf\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-jvpfj\" (UID: \"88373a98-b547-43d0-982f-69c1fd9f5507\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-jvpfj" Oct 03 00:37:53 crc kubenswrapper[4832]: I1003 00:37:53.224622 4832 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/88373a98-b547-43d0-982f-69c1fd9f5507-sg-core-config\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-jvpfj\" (UID: \"88373a98-b547-43d0-982f-69c1fd9f5507\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-jvpfj" Oct 03 00:37:53 crc kubenswrapper[4832]: I1003 00:37:53.224692 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/88373a98-b547-43d0-982f-69c1fd9f5507-socket-dir\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-jvpfj\" (UID: \"88373a98-b547-43d0-982f-69c1fd9f5507\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-jvpfj" Oct 03 00:37:53 crc kubenswrapper[4832]: I1003 00:37:53.224721 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-coll-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/88373a98-b547-43d0-982f-69c1fd9f5507-default-cloud1-coll-meter-proxy-tls\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-jvpfj\" (UID: \"88373a98-b547-43d0-982f-69c1fd9f5507\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-jvpfj" Oct 03 00:37:53 crc kubenswrapper[4832]: I1003 00:37:53.224779 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k8jwf\" (UniqueName: \"kubernetes.io/projected/88373a98-b547-43d0-982f-69c1fd9f5507-kube-api-access-k8jwf\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-jvpfj\" (UID: \"88373a98-b547-43d0-982f-69c1fd9f5507\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-jvpfj" Oct 03 00:37:53 crc kubenswrapper[4832]: I1003 00:37:53.224880 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/88373a98-b547-43d0-982f-69c1fd9f5507-session-secret\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-jvpfj\" (UID: \"88373a98-b547-43d0-982f-69c1fd9f5507\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-jvpfj" Oct 03 00:37:53 crc kubenswrapper[4832]: E1003 00:37:53.224944 4832 secret.go:188] Couldn't get secret service-telemetry/default-cloud1-coll-meter-proxy-tls: secret "default-cloud1-coll-meter-proxy-tls" not found Oct 03 00:37:53 crc kubenswrapper[4832]: E1003 00:37:53.225036 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/88373a98-b547-43d0-982f-69c1fd9f5507-default-cloud1-coll-meter-proxy-tls podName:88373a98-b547-43d0-982f-69c1fd9f5507 nodeName:}" failed. No retries permitted until 2025-10-03 00:37:53.725013278 +0000 UTC m=+1866.987168434 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "default-cloud1-coll-meter-proxy-tls" (UniqueName: "kubernetes.io/secret/88373a98-b547-43d0-982f-69c1fd9f5507-default-cloud1-coll-meter-proxy-tls") pod "default-cloud1-coll-meter-smartgateway-7cd87f9766-jvpfj" (UID: "88373a98-b547-43d0-982f-69c1fd9f5507") : secret "default-cloud1-coll-meter-proxy-tls" not found Oct 03 00:37:53 crc kubenswrapper[4832]: I1003 00:37:53.225190 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/88373a98-b547-43d0-982f-69c1fd9f5507-socket-dir\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-jvpfj\" (UID: \"88373a98-b547-43d0-982f-69c1fd9f5507\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-jvpfj" Oct 03 00:37:53 crc kubenswrapper[4832]: I1003 00:37:53.225787 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/88373a98-b547-43d0-982f-69c1fd9f5507-sg-core-config\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-jvpfj\" (UID: \"88373a98-b547-43d0-982f-69c1fd9f5507\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-jvpfj" Oct 03 00:37:53 crc kubenswrapper[4832]: I1003 00:37:53.243909 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/88373a98-b547-43d0-982f-69c1fd9f5507-session-secret\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-jvpfj\" (UID: \"88373a98-b547-43d0-982f-69c1fd9f5507\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-jvpfj" Oct 03 00:37:53 crc kubenswrapper[4832]: I1003 00:37:53.247564 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k8jwf\" (UniqueName: \"kubernetes.io/projected/88373a98-b547-43d0-982f-69c1fd9f5507-kube-api-access-k8jwf\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-jvpfj\" (UID: \"88373a98-b547-43d0-982f-69c1fd9f5507\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-jvpfj" Oct 03 00:37:53 crc kubenswrapper[4832]: I1003 00:37:53.731991 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-coll-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/88373a98-b547-43d0-982f-69c1fd9f5507-default-cloud1-coll-meter-proxy-tls\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-jvpfj\" (UID: \"88373a98-b547-43d0-982f-69c1fd9f5507\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-jvpfj" Oct 03 00:37:53 crc kubenswrapper[4832]: E1003 00:37:53.732409 4832 secret.go:188] Couldn't get secret service-telemetry/default-cloud1-coll-meter-proxy-tls: secret "default-cloud1-coll-meter-proxy-tls" not found Oct 03 00:37:53 crc kubenswrapper[4832]: E1003 00:37:53.732557 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/88373a98-b547-43d0-982f-69c1fd9f5507-default-cloud1-coll-meter-proxy-tls podName:88373a98-b547-43d0-982f-69c1fd9f5507 nodeName:}" failed. No retries permitted until 2025-10-03 00:37:54.732540367 +0000 UTC m=+1867.994695473 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "default-cloud1-coll-meter-proxy-tls" (UniqueName: "kubernetes.io/secret/88373a98-b547-43d0-982f-69c1fd9f5507-default-cloud1-coll-meter-proxy-tls") pod "default-cloud1-coll-meter-smartgateway-7cd87f9766-jvpfj" (UID: "88373a98-b547-43d0-982f-69c1fd9f5507") : secret "default-cloud1-coll-meter-proxy-tls" not found Oct 03 00:37:54 crc kubenswrapper[4832]: I1003 00:37:54.746800 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-coll-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/88373a98-b547-43d0-982f-69c1fd9f5507-default-cloud1-coll-meter-proxy-tls\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-jvpfj\" (UID: \"88373a98-b547-43d0-982f-69c1fd9f5507\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-jvpfj" Oct 03 00:37:54 crc kubenswrapper[4832]: I1003 00:37:54.751733 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-cloud1-coll-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/88373a98-b547-43d0-982f-69c1fd9f5507-default-cloud1-coll-meter-proxy-tls\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-jvpfj\" (UID: \"88373a98-b547-43d0-982f-69c1fd9f5507\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-jvpfj" Oct 03 00:37:54 crc kubenswrapper[4832]: I1003 00:37:54.818848 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-jvpfj" Oct 03 00:37:54 crc kubenswrapper[4832]: I1003 00:37:54.994028 4832 scope.go:117] "RemoveContainer" containerID="cce5a13c3b0bdffe025c7ccda6f7b04bbae7a6c92f96992063e2f0b6eeeda214" Oct 03 00:37:56 crc kubenswrapper[4832]: I1003 00:37:56.146490 4832 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-n2mcv"] Oct 03 00:37:56 crc kubenswrapper[4832]: I1003 00:37:56.157607 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-n2mcv"] Oct 03 00:37:56 crc kubenswrapper[4832]: I1003 00:37:56.157739 4832 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-n2mcv" Oct 03 00:37:56 crc kubenswrapper[4832]: I1003 00:37:56.162115 4832 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-cloud1-ceil-meter-proxy-tls" Oct 03 00:37:56 crc kubenswrapper[4832]: I1003 00:37:56.162335 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"default-cloud1-ceil-meter-sg-core-configmap" Oct 03 00:37:56 crc kubenswrapper[4832]: I1003 00:37:56.266545 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/4d0adf6e-7076-4af6-8e81-54e16611154f-socket-dir\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-n2mcv\" (UID: \"4d0adf6e-7076-4af6-8e81-54e16611154f\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-n2mcv" Oct 03 00:37:56 crc kubenswrapper[4832]: I1003 00:37:56.266705 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/4d0adf6e-7076-4af6-8e81-54e16611154f-session-secret\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-n2mcv\" (UID: \"4d0adf6e-7076-4af6-8e81-54e16611154f\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-n2mcv" Oct 03 00:37:56 crc kubenswrapper[4832]: I1003 00:37:56.266761 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/4d0adf6e-7076-4af6-8e81-54e16611154f-sg-core-config\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-n2mcv\" (UID: \"4d0adf6e-7076-4af6-8e81-54e16611154f\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-n2mcv" Oct 03 00:37:56 crc kubenswrapper[4832]: I1003 00:37:56.266858 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-cloud1-ceil-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/4d0adf6e-7076-4af6-8e81-54e16611154f-default-cloud1-ceil-meter-proxy-tls\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-n2mcv\" (UID: \"4d0adf6e-7076-4af6-8e81-54e16611154f\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-n2mcv" Oct 03 00:37:56 crc kubenswrapper[4832]: I1003 00:37:56.266894 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dc56\" (UniqueName: \"kubernetes.io/projected/4d0adf6e-7076-4af6-8e81-54e16611154f-kube-api-access-5dc56\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-n2mcv\" (UID: \"4d0adf6e-7076-4af6-8e81-54e16611154f\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-n2mcv" Oct 03 00:37:56 crc kubenswrapper[4832]: I1003 00:37:56.368803 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/4d0adf6e-7076-4af6-8e81-54e16611154f-socket-dir\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-n2mcv\" (UID: \"4d0adf6e-7076-4af6-8e81-54e16611154f\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-n2mcv" Oct 03 00:37:56 crc kubenswrapper[4832]: I1003 00:37:56.368872 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"session-secret\" (UniqueName: 
\"kubernetes.io/secret/4d0adf6e-7076-4af6-8e81-54e16611154f-session-secret\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-n2mcv\" (UID: \"4d0adf6e-7076-4af6-8e81-54e16611154f\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-n2mcv" Oct 03 00:37:56 crc kubenswrapper[4832]: I1003 00:37:56.368893 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/4d0adf6e-7076-4af6-8e81-54e16611154f-sg-core-config\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-n2mcv\" (UID: \"4d0adf6e-7076-4af6-8e81-54e16611154f\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-n2mcv" Oct 03 00:37:56 crc kubenswrapper[4832]: I1003 00:37:56.368930 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-ceil-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/4d0adf6e-7076-4af6-8e81-54e16611154f-default-cloud1-ceil-meter-proxy-tls\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-n2mcv\" (UID: \"4d0adf6e-7076-4af6-8e81-54e16611154f\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-n2mcv" Oct 03 00:37:56 crc kubenswrapper[4832]: I1003 00:37:56.368951 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5dc56\" (UniqueName: \"kubernetes.io/projected/4d0adf6e-7076-4af6-8e81-54e16611154f-kube-api-access-5dc56\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-n2mcv\" (UID: \"4d0adf6e-7076-4af6-8e81-54e16611154f\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-n2mcv" Oct 03 00:37:56 crc kubenswrapper[4832]: I1003 00:37:56.369641 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/4d0adf6e-7076-4af6-8e81-54e16611154f-socket-dir\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-n2mcv\" (UID: \"4d0adf6e-7076-4af6-8e81-54e16611154f\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-n2mcv" Oct 03 00:37:56 crc kubenswrapper[4832]: E1003 00:37:56.370346 4832 secret.go:188] Couldn't get secret service-telemetry/default-cloud1-ceil-meter-proxy-tls: secret "default-cloud1-ceil-meter-proxy-tls" not found Oct 03 00:37:56 crc kubenswrapper[4832]: E1003 00:37:56.370402 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4d0adf6e-7076-4af6-8e81-54e16611154f-default-cloud1-ceil-meter-proxy-tls podName:4d0adf6e-7076-4af6-8e81-54e16611154f nodeName:}" failed. No retries permitted until 2025-10-03 00:37:56.870387491 +0000 UTC m=+1870.132542597 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "default-cloud1-ceil-meter-proxy-tls" (UniqueName: "kubernetes.io/secret/4d0adf6e-7076-4af6-8e81-54e16611154f-default-cloud1-ceil-meter-proxy-tls") pod "default-cloud1-ceil-meter-smartgateway-57948895dc-n2mcv" (UID: "4d0adf6e-7076-4af6-8e81-54e16611154f") : secret "default-cloud1-ceil-meter-proxy-tls" not found Oct 03 00:37:56 crc kubenswrapper[4832]: I1003 00:37:56.370956 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/4d0adf6e-7076-4af6-8e81-54e16611154f-sg-core-config\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-n2mcv\" (UID: \"4d0adf6e-7076-4af6-8e81-54e16611154f\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-n2mcv" Oct 03 00:37:56 crc kubenswrapper[4832]: I1003 00:37:56.373347 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/4d0adf6e-7076-4af6-8e81-54e16611154f-session-secret\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-n2mcv\" (UID: \"4d0adf6e-7076-4af6-8e81-54e16611154f\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-n2mcv" Oct 03 00:37:56 crc kubenswrapper[4832]: I1003 00:37:56.394767 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dc56\" (UniqueName: \"kubernetes.io/projected/4d0adf6e-7076-4af6-8e81-54e16611154f-kube-api-access-5dc56\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-n2mcv\" (UID: \"4d0adf6e-7076-4af6-8e81-54e16611154f\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-n2mcv" Oct 03 00:37:56 crc kubenswrapper[4832]: I1003 00:37:56.753780 4832 scope.go:117] "RemoveContainer" containerID="0b9480d4ca64492ad1eb375d771f8da130972cd257da020961461d420ead10f2" Oct 03 00:37:56 crc kubenswrapper[4832]: I1003 00:37:56.878148 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-ceil-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/4d0adf6e-7076-4af6-8e81-54e16611154f-default-cloud1-ceil-meter-proxy-tls\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-n2mcv\" (UID: \"4d0adf6e-7076-4af6-8e81-54e16611154f\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-n2mcv" Oct 03 00:37:56 crc kubenswrapper[4832]: E1003 00:37:56.878646 4832 secret.go:188] Couldn't get secret service-telemetry/default-cloud1-ceil-meter-proxy-tls: secret "default-cloud1-ceil-meter-proxy-tls" not found Oct 03 00:37:56 crc kubenswrapper[4832]: E1003 00:37:56.878699 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4d0adf6e-7076-4af6-8e81-54e16611154f-default-cloud1-ceil-meter-proxy-tls podName:4d0adf6e-7076-4af6-8e81-54e16611154f nodeName:}" failed. No retries permitted until 2025-10-03 00:37:57.878685793 +0000 UTC m=+1871.140840899 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "default-cloud1-ceil-meter-proxy-tls" (UniqueName: "kubernetes.io/secret/4d0adf6e-7076-4af6-8e81-54e16611154f-default-cloud1-ceil-meter-proxy-tls") pod "default-cloud1-ceil-meter-smartgateway-57948895dc-n2mcv" (UID: "4d0adf6e-7076-4af6-8e81-54e16611154f") : secret "default-cloud1-ceil-meter-proxy-tls" not found Oct 03 00:37:57 crc kubenswrapper[4832]: I1003 00:37:57.209809 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-jvpfj"] Oct 03 00:37:57 crc kubenswrapper[4832]: W1003 00:37:57.429348 4832 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod88373a98_b547_43d0_982f_69c1fd9f5507.slice/crio-0731d603de70fdb4f90dd3abc37d028b765d2ffd4127a1a0c19196731d16148e WatchSource:0}: Error finding container 0731d603de70fdb4f90dd3abc37d028b765d2ffd4127a1a0c19196731d16148e: Status 404 returned error can't find the container with id 0731d603de70fdb4f90dd3abc37d028b765d2ffd4127a1a0c19196731d16148e Oct 03 00:37:57 crc kubenswrapper[4832]: I1003 00:37:57.895241 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-ceil-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/4d0adf6e-7076-4af6-8e81-54e16611154f-default-cloud1-ceil-meter-proxy-tls\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-n2mcv\" (UID: \"4d0adf6e-7076-4af6-8e81-54e16611154f\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-n2mcv" Oct 03 00:37:57 crc kubenswrapper[4832]: I1003 00:37:57.901658 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-cloud1-ceil-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/4d0adf6e-7076-4af6-8e81-54e16611154f-default-cloud1-ceil-meter-proxy-tls\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-n2mcv\" (UID: \"4d0adf6e-7076-4af6-8e81-54e16611154f\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-n2mcv" Oct 03 00:37:57 crc kubenswrapper[4832]: I1003 00:37:57.986950 4832 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-n2mcv" Oct 03 00:37:58 crc kubenswrapper[4832]: I1003 00:37:58.202099 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-jvpfj" event={"ID":"88373a98-b547-43d0-982f-69c1fd9f5507","Type":"ContainerStarted","Data":"0731d603de70fdb4f90dd3abc37d028b765d2ffd4127a1a0c19196731d16148e"} Oct 03 00:37:58 crc kubenswrapper[4832]: I1003 00:37:58.471085 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-n2mcv"] Oct 03 00:37:59 crc kubenswrapper[4832]: I1003 00:37:59.211393 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-n2mcv" event={"ID":"4d0adf6e-7076-4af6-8e81-54e16611154f","Type":"ContainerStarted","Data":"d0a0b4458d402dd0d438dbe32abb120963e77d047168a5193cd5ff0136160a43"} Oct 03 00:37:59 crc kubenswrapper[4832]: I1003 00:37:59.214471 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-default-0" event={"ID":"0c8b2b5b-8aba-499a-8e9f-5e166ede2ac2","Type":"ContainerStarted","Data":"a500704b73b5beb45d8e0369bcc3a849316a7b22ac9d8c4b4a98712dccd33c6a"} Oct 03 00:37:59 crc kubenswrapper[4832]: I1003 00:37:59.216497 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-jvpfj" event={"ID":"88373a98-b547-43d0-982f-69c1fd9f5507","Type":"ContainerStarted","Data":"906d06c4717446da6e50266c9e3802f66134c8741c0fd07d2b1a13b2fca51fde"} Oct 03 00:37:59 crc kubenswrapper[4832]: I1003 00:37:59.221419 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/alertmanager-default-0" event={"ID":"7a756837-fae9-4644-9fed-56fe7576ea4c","Type":"ContainerStarted","Data":"883293e95c22821e1b01008680f0b074c25b2b44d46d0f3a05408d3b86467ec5"} Oct 03 00:37:59 crc kubenswrapper[4832]: I1003 00:37:59.253148 4832 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/prometheus-default-0" podStartSLOduration=4.992462112 podStartE2EDuration="34.253130671s" podCreationTimestamp="2025-10-03 00:37:25 +0000 UTC" firstStartedPulling="2025-10-03 00:37:28.541479835 +0000 UTC m=+1841.803634941" lastFinishedPulling="2025-10-03 00:37:57.802148394 +0000 UTC m=+1871.064303500" observedRunningTime="2025-10-03 00:37:59.250034782 +0000 UTC m=+1872.512189908" watchObservedRunningTime="2025-10-03 00:37:59.253130671 +0000 UTC m=+1872.515285777" Oct 03 00:37:59 crc kubenswrapper[4832]: I1003 00:37:59.643733 4832 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-9x79p"] Oct 03 00:37:59 crc kubenswrapper[4832]: I1003 00:37:59.646294 4832 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-9x79p" Oct 03 00:37:59 crc kubenswrapper[4832]: I1003 00:37:59.648913 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"default-cloud1-sens-meter-sg-core-configmap" Oct 03 00:37:59 crc kubenswrapper[4832]: I1003 00:37:59.649474 4832 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-cloud1-sens-meter-proxy-tls" Oct 03 00:37:59 crc kubenswrapper[4832]: I1003 00:37:59.651297 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-9x79p"] Oct 03 00:37:59 crc kubenswrapper[4832]: I1003 00:37:59.731967 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/c76c3820-5067-4afd-8954-5124625f3462-sg-core-config\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-9x79p\" (UID: \"c76c3820-5067-4afd-8954-5124625f3462\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-9x79p" Oct 03 00:37:59 crc kubenswrapper[4832]: I1003 00:37:59.732021 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/c76c3820-5067-4afd-8954-5124625f3462-session-secret\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-9x79p\" (UID: \"c76c3820-5067-4afd-8954-5124625f3462\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-9x79p" Oct 03 00:37:59 crc kubenswrapper[4832]: I1003 00:37:59.732049 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/c76c3820-5067-4afd-8954-5124625f3462-socket-dir\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-9x79p\" (UID: \"c76c3820-5067-4afd-8954-5124625f3462\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-9x79p" Oct 03 00:37:59 crc kubenswrapper[4832]: I1003 00:37:59.732154 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-cloud1-sens-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/c76c3820-5067-4afd-8954-5124625f3462-default-cloud1-sens-meter-proxy-tls\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-9x79p\" (UID: \"c76c3820-5067-4afd-8954-5124625f3462\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-9x79p" Oct 03 00:37:59 crc kubenswrapper[4832]: I1003 00:37:59.732224 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5dhl\" (UniqueName: \"kubernetes.io/projected/c76c3820-5067-4afd-8954-5124625f3462-kube-api-access-b5dhl\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-9x79p\" (UID: \"c76c3820-5067-4afd-8954-5124625f3462\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-9x79p" Oct 03 00:37:59 crc kubenswrapper[4832]: I1003 00:37:59.833513 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/c76c3820-5067-4afd-8954-5124625f3462-session-secret\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-9x79p\" (UID: \"c76c3820-5067-4afd-8954-5124625f3462\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-9x79p" Oct 03 00:37:59 crc 
kubenswrapper[4832]: I1003 00:37:59.833569 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/c76c3820-5067-4afd-8954-5124625f3462-socket-dir\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-9x79p\" (UID: \"c76c3820-5067-4afd-8954-5124625f3462\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-9x79p" Oct 03 00:37:59 crc kubenswrapper[4832]: I1003 00:37:59.833615 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-sens-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/c76c3820-5067-4afd-8954-5124625f3462-default-cloud1-sens-meter-proxy-tls\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-9x79p\" (UID: \"c76c3820-5067-4afd-8954-5124625f3462\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-9x79p" Oct 03 00:37:59 crc kubenswrapper[4832]: I1003 00:37:59.833648 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b5dhl\" (UniqueName: \"kubernetes.io/projected/c76c3820-5067-4afd-8954-5124625f3462-kube-api-access-b5dhl\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-9x79p\" (UID: \"c76c3820-5067-4afd-8954-5124625f3462\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-9x79p" Oct 03 00:37:59 crc kubenswrapper[4832]: I1003 00:37:59.833738 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/c76c3820-5067-4afd-8954-5124625f3462-sg-core-config\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-9x79p\" (UID: \"c76c3820-5067-4afd-8954-5124625f3462\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-9x79p" Oct 03 00:37:59 crc kubenswrapper[4832]: E1003 00:37:59.833988 4832 secret.go:188] Couldn't get secret service-telemetry/default-cloud1-sens-meter-proxy-tls: secret "default-cloud1-sens-meter-proxy-tls" not found Oct 03 00:37:59 crc kubenswrapper[4832]: E1003 00:37:59.834062 4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c76c3820-5067-4afd-8954-5124625f3462-default-cloud1-sens-meter-proxy-tls podName:c76c3820-5067-4afd-8954-5124625f3462 nodeName:}" failed. No retries permitted until 2025-10-03 00:38:00.334044566 +0000 UTC m=+1873.596199672 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "default-cloud1-sens-meter-proxy-tls" (UniqueName: "kubernetes.io/secret/c76c3820-5067-4afd-8954-5124625f3462-default-cloud1-sens-meter-proxy-tls") pod "default-cloud1-sens-meter-smartgateway-5759b4d97-9x79p" (UID: "c76c3820-5067-4afd-8954-5124625f3462") : secret "default-cloud1-sens-meter-proxy-tls" not found Oct 03 00:37:59 crc kubenswrapper[4832]: I1003 00:37:59.834598 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/c76c3820-5067-4afd-8954-5124625f3462-socket-dir\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-9x79p\" (UID: \"c76c3820-5067-4afd-8954-5124625f3462\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-9x79p" Oct 03 00:37:59 crc kubenswrapper[4832]: I1003 00:37:59.834612 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/c76c3820-5067-4afd-8954-5124625f3462-sg-core-config\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-9x79p\" (UID: \"c76c3820-5067-4afd-8954-5124625f3462\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-9x79p" Oct 03 00:37:59 crc kubenswrapper[4832]: I1003 00:37:59.883626 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/c76c3820-5067-4afd-8954-5124625f3462-session-secret\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-9x79p\" (UID: \"c76c3820-5067-4afd-8954-5124625f3462\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-9x79p" Oct 03 00:37:59 crc kubenswrapper[4832]: I1003 00:37:59.883867 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5dhl\" (UniqueName: \"kubernetes.io/projected/c76c3820-5067-4afd-8954-5124625f3462-kube-api-access-b5dhl\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-9x79p\" (UID: \"c76c3820-5067-4afd-8954-5124625f3462\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-9x79p" Oct 03 00:38:00 crc kubenswrapper[4832]: I1003 00:38:00.230501 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/alertmanager-default-0" event={"ID":"7a756837-fae9-4644-9fed-56fe7576ea4c","Type":"ContainerStarted","Data":"cc49d357abc7cd50536c7365f14ee151366ca7c92dda80d3c8e1d4e89cc6b95b"} Oct 03 00:38:00 crc kubenswrapper[4832]: I1003 00:38:00.233465 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-n2mcv" event={"ID":"4d0adf6e-7076-4af6-8e81-54e16611154f","Type":"ContainerStarted","Data":"870df230c07bd47bc771b6781d30db978784573ba073c4be65dc079bda845845"} Oct 03 00:38:00 crc kubenswrapper[4832]: I1003 00:38:00.340628 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-sens-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/c76c3820-5067-4afd-8954-5124625f3462-default-cloud1-sens-meter-proxy-tls\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-9x79p\" (UID: \"c76c3820-5067-4afd-8954-5124625f3462\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-9x79p" Oct 03 00:38:00 crc kubenswrapper[4832]: E1003 00:38:00.341383 4832 secret.go:188] Couldn't get secret service-telemetry/default-cloud1-sens-meter-proxy-tls: secret "default-cloud1-sens-meter-proxy-tls" not found Oct 03 00:38:00 crc kubenswrapper[4832]: E1003 00:38:00.341473 
4832 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c76c3820-5067-4afd-8954-5124625f3462-default-cloud1-sens-meter-proxy-tls podName:c76c3820-5067-4afd-8954-5124625f3462 nodeName:}" failed. No retries permitted until 2025-10-03 00:38:01.341451011 +0000 UTC m=+1874.603606197 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "default-cloud1-sens-meter-proxy-tls" (UniqueName: "kubernetes.io/secret/c76c3820-5067-4afd-8954-5124625f3462-default-cloud1-sens-meter-proxy-tls") pod "default-cloud1-sens-meter-smartgateway-5759b4d97-9x79p" (UID: "c76c3820-5067-4afd-8954-5124625f3462") : secret "default-cloud1-sens-meter-proxy-tls" not found Oct 03 00:38:01 crc kubenswrapper[4832]: I1003 00:38:01.247580 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/alertmanager-default-0" event={"ID":"7a756837-fae9-4644-9fed-56fe7576ea4c","Type":"ContainerStarted","Data":"99ff5d238c8750f37fcbcb68c6f4b7b7587a27fd127b791805b308b14f46d7c0"} Oct 03 00:38:01 crc kubenswrapper[4832]: I1003 00:38:01.357594 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-sens-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/c76c3820-5067-4afd-8954-5124625f3462-default-cloud1-sens-meter-proxy-tls\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-9x79p\" (UID: \"c76c3820-5067-4afd-8954-5124625f3462\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-9x79p" Oct 03 00:38:01 crc kubenswrapper[4832]: I1003 00:38:01.363817 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-cloud1-sens-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/c76c3820-5067-4afd-8954-5124625f3462-default-cloud1-sens-meter-proxy-tls\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-9x79p\" (UID: \"c76c3820-5067-4afd-8954-5124625f3462\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-9x79p" Oct 03 00:38:01 crc kubenswrapper[4832]: I1003 00:38:01.473440 4832 util.go:30] "No sandbox for pod can be found. 
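The failed mount of the default-cloud1-sens-meter-proxy-tls secret above is retried on a doubling schedule: 500ms after the first failure, 1s after the second, and the attempt at 00:38:01 succeeds once the secret exists. Below is a minimal sketch of that kind of doubling backoff, assuming the 500ms starting delay visible in the log; the two-minute ceiling is an illustrative assumption, not a value taken from the log or from kubelet source.

```go
package main

import (
	"fmt"
	"time"
)

// nextRetryDelay doubles the previous delay, starting at 500ms and stopping at
// an illustrative two-minute ceiling; only the 500ms -> 1s step is actually
// observable in the entries above.
func nextRetryDelay(prev time.Duration) time.Duration {
	const (
		initial  = 500 * time.Millisecond
		maxDelay = 2 * time.Minute // assumption, for illustration only
	)
	if prev <= 0 {
		return initial
	}
	if next := 2 * prev; next < maxDelay {
		return next
	}
	return maxDelay
}

func main() {
	var delay time.Duration
	for attempt := 1; attempt <= 5; attempt++ {
		delay = nextRetryDelay(delay)
		fmt.Printf("failure %d: durationBeforeRetry %v\n", attempt, delay)
	}
}
```

The first two printed delays (500ms, 1s) line up with the durationBeforeRetry values in the two failure entries above.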
Need to start a new one" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-9x79p" Oct 03 00:38:01 crc kubenswrapper[4832]: I1003 00:38:01.727160 4832 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/alertmanager-default-0" podStartSLOduration=15.28728863 podStartE2EDuration="23.727143698s" podCreationTimestamp="2025-10-03 00:37:38 +0000 UTC" firstStartedPulling="2025-10-03 00:37:52.14241637 +0000 UTC m=+1865.404571476" lastFinishedPulling="2025-10-03 00:38:00.582271438 +0000 UTC m=+1873.844426544" observedRunningTime="2025-10-03 00:38:01.299233685 +0000 UTC m=+1874.561388801" watchObservedRunningTime="2025-10-03 00:38:01.727143698 +0000 UTC m=+1874.989298794" Oct 03 00:38:01 crc kubenswrapper[4832]: I1003 00:38:01.731791 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-9x79p"] Oct 03 00:38:01 crc kubenswrapper[4832]: W1003 00:38:01.750591 4832 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc76c3820_5067_4afd_8954_5124625f3462.slice/crio-b211f8e9acc446fbe52117d2d1e161bd403ac7f47147fd3b30bdde39957a41cf WatchSource:0}: Error finding container b211f8e9acc446fbe52117d2d1e161bd403ac7f47147fd3b30bdde39957a41cf: Status 404 returned error can't find the container with id b211f8e9acc446fbe52117d2d1e161bd403ac7f47147fd3b30bdde39957a41cf Oct 03 00:38:02 crc kubenswrapper[4832]: I1003 00:38:02.258378 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-9x79p" event={"ID":"c76c3820-5067-4afd-8954-5124625f3462","Type":"ContainerStarted","Data":"b211f8e9acc446fbe52117d2d1e161bd403ac7f47147fd3b30bdde39957a41cf"} Oct 03 00:38:03 crc kubenswrapper[4832]: I1003 00:38:03.122205 4832 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="service-telemetry/prometheus-default-0" Oct 03 00:38:06 crc kubenswrapper[4832]: I1003 00:38:06.301094 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-jvpfj" event={"ID":"88373a98-b547-43d0-982f-69c1fd9f5507","Type":"ContainerStarted","Data":"5407c88691d2c6957b0a9d4be854793050dc8d09ec135b1b669d8b7b30c85b14"} Oct 03 00:38:06 crc kubenswrapper[4832]: I1003 00:38:06.304257 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-9x79p" event={"ID":"c76c3820-5067-4afd-8954-5124625f3462","Type":"ContainerStarted","Data":"20a54f8e303a53eb2f314adae3548ac632784477780293283d0bfc3f3a03ee12"} Oct 03 00:38:06 crc kubenswrapper[4832]: I1003 00:38:06.304299 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-9x79p" event={"ID":"c76c3820-5067-4afd-8954-5124625f3462","Type":"ContainerStarted","Data":"332ddf388847ee2cf48a020071fe013cb6afb05e09cda0348c3b9f2b2dbb1ce9"} Oct 03 00:38:06 crc kubenswrapper[4832]: I1003 00:38:06.308167 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-n2mcv" event={"ID":"4d0adf6e-7076-4af6-8e81-54e16611154f","Type":"ContainerStarted","Data":"e197b466155eb56f654e1b6ebe340d69b2474b14f67acf3efe1e05b8a9b1fe25"} Oct 03 00:38:07 crc kubenswrapper[4832]: I1003 00:38:07.175198 4832 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["service-telemetry/default-cloud1-coll-event-smartgateway-5c89775d54-8f5sj"] Oct 03 00:38:07 crc kubenswrapper[4832]: I1003 00:38:07.176869 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-cloud1-coll-event-smartgateway-5c89775d54-8f5sj" Oct 03 00:38:07 crc kubenswrapper[4832]: I1003 00:38:07.179093 4832 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-cert" Oct 03 00:38:07 crc kubenswrapper[4832]: I1003 00:38:07.179143 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"default-cloud1-coll-event-sg-core-configmap" Oct 03 00:38:07 crc kubenswrapper[4832]: I1003 00:38:07.185141 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-coll-event-smartgateway-5c89775d54-8f5sj"] Oct 03 00:38:07 crc kubenswrapper[4832]: I1003 00:38:07.336703 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/d30f9d9a-e3d9-4ea2-a2bb-7cae3b2f28fd-sg-core-config\") pod \"default-cloud1-coll-event-smartgateway-5c89775d54-8f5sj\" (UID: \"d30f9d9a-e3d9-4ea2-a2bb-7cae3b2f28fd\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-5c89775d54-8f5sj" Oct 03 00:38:07 crc kubenswrapper[4832]: I1003 00:38:07.336772 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/d30f9d9a-e3d9-4ea2-a2bb-7cae3b2f28fd-socket-dir\") pod \"default-cloud1-coll-event-smartgateway-5c89775d54-8f5sj\" (UID: \"d30f9d9a-e3d9-4ea2-a2bb-7cae3b2f28fd\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-5c89775d54-8f5sj" Oct 03 00:38:07 crc kubenswrapper[4832]: I1003 00:38:07.336860 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vvf7\" (UniqueName: \"kubernetes.io/projected/d30f9d9a-e3d9-4ea2-a2bb-7cae3b2f28fd-kube-api-access-9vvf7\") pod \"default-cloud1-coll-event-smartgateway-5c89775d54-8f5sj\" (UID: \"d30f9d9a-e3d9-4ea2-a2bb-7cae3b2f28fd\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-5c89775d54-8f5sj" Oct 03 00:38:07 crc kubenswrapper[4832]: I1003 00:38:07.337017 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-certs\" (UniqueName: \"kubernetes.io/secret/d30f9d9a-e3d9-4ea2-a2bb-7cae3b2f28fd-elastic-certs\") pod \"default-cloud1-coll-event-smartgateway-5c89775d54-8f5sj\" (UID: \"d30f9d9a-e3d9-4ea2-a2bb-7cae3b2f28fd\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-5c89775d54-8f5sj" Oct 03 00:38:07 crc kubenswrapper[4832]: I1003 00:38:07.438389 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9vvf7\" (UniqueName: \"kubernetes.io/projected/d30f9d9a-e3d9-4ea2-a2bb-7cae3b2f28fd-kube-api-access-9vvf7\") pod \"default-cloud1-coll-event-smartgateway-5c89775d54-8f5sj\" (UID: \"d30f9d9a-e3d9-4ea2-a2bb-7cae3b2f28fd\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-5c89775d54-8f5sj" Oct 03 00:38:07 crc kubenswrapper[4832]: I1003 00:38:07.438472 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-certs\" (UniqueName: \"kubernetes.io/secret/d30f9d9a-e3d9-4ea2-a2bb-7cae3b2f28fd-elastic-certs\") pod \"default-cloud1-coll-event-smartgateway-5c89775d54-8f5sj\" (UID: 
\"d30f9d9a-e3d9-4ea2-a2bb-7cae3b2f28fd\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-5c89775d54-8f5sj" Oct 03 00:38:07 crc kubenswrapper[4832]: I1003 00:38:07.438513 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/d30f9d9a-e3d9-4ea2-a2bb-7cae3b2f28fd-sg-core-config\") pod \"default-cloud1-coll-event-smartgateway-5c89775d54-8f5sj\" (UID: \"d30f9d9a-e3d9-4ea2-a2bb-7cae3b2f28fd\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-5c89775d54-8f5sj" Oct 03 00:38:07 crc kubenswrapper[4832]: I1003 00:38:07.438535 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/d30f9d9a-e3d9-4ea2-a2bb-7cae3b2f28fd-socket-dir\") pod \"default-cloud1-coll-event-smartgateway-5c89775d54-8f5sj\" (UID: \"d30f9d9a-e3d9-4ea2-a2bb-7cae3b2f28fd\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-5c89775d54-8f5sj" Oct 03 00:38:07 crc kubenswrapper[4832]: I1003 00:38:07.439332 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/d30f9d9a-e3d9-4ea2-a2bb-7cae3b2f28fd-socket-dir\") pod \"default-cloud1-coll-event-smartgateway-5c89775d54-8f5sj\" (UID: \"d30f9d9a-e3d9-4ea2-a2bb-7cae3b2f28fd\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-5c89775d54-8f5sj" Oct 03 00:38:07 crc kubenswrapper[4832]: I1003 00:38:07.439498 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/d30f9d9a-e3d9-4ea2-a2bb-7cae3b2f28fd-sg-core-config\") pod \"default-cloud1-coll-event-smartgateway-5c89775d54-8f5sj\" (UID: \"d30f9d9a-e3d9-4ea2-a2bb-7cae3b2f28fd\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-5c89775d54-8f5sj" Oct 03 00:38:07 crc kubenswrapper[4832]: I1003 00:38:07.446654 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-certs\" (UniqueName: \"kubernetes.io/secret/d30f9d9a-e3d9-4ea2-a2bb-7cae3b2f28fd-elastic-certs\") pod \"default-cloud1-coll-event-smartgateway-5c89775d54-8f5sj\" (UID: \"d30f9d9a-e3d9-4ea2-a2bb-7cae3b2f28fd\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-5c89775d54-8f5sj" Oct 03 00:38:07 crc kubenswrapper[4832]: I1003 00:38:07.462198 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vvf7\" (UniqueName: \"kubernetes.io/projected/d30f9d9a-e3d9-4ea2-a2bb-7cae3b2f28fd-kube-api-access-9vvf7\") pod \"default-cloud1-coll-event-smartgateway-5c89775d54-8f5sj\" (UID: \"d30f9d9a-e3d9-4ea2-a2bb-7cae3b2f28fd\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-5c89775d54-8f5sj" Oct 03 00:38:07 crc kubenswrapper[4832]: I1003 00:38:07.516041 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-cloud1-coll-event-smartgateway-5c89775d54-8f5sj" Oct 03 00:38:08 crc kubenswrapper[4832]: I1003 00:38:08.138212 4832 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-cloud1-ceil-event-smartgateway-596fcd4567-w6w4h"] Oct 03 00:38:08 crc kubenswrapper[4832]: I1003 00:38:08.140366 4832 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-596fcd4567-w6w4h" Oct 03 00:38:08 crc kubenswrapper[4832]: I1003 00:38:08.147532 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"default-cloud1-ceil-event-sg-core-configmap" Oct 03 00:38:08 crc kubenswrapper[4832]: I1003 00:38:08.155315 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-ceil-event-smartgateway-596fcd4567-w6w4h"] Oct 03 00:38:08 crc kubenswrapper[4832]: I1003 00:38:08.250647 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/bfbd463d-c0ee-40b4-b7af-d363939c485f-sg-core-config\") pod \"default-cloud1-ceil-event-smartgateway-596fcd4567-w6w4h\" (UID: \"bfbd463d-c0ee-40b4-b7af-d363939c485f\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-596fcd4567-w6w4h" Oct 03 00:38:08 crc kubenswrapper[4832]: I1003 00:38:08.250726 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4l6t\" (UniqueName: \"kubernetes.io/projected/bfbd463d-c0ee-40b4-b7af-d363939c485f-kube-api-access-m4l6t\") pod \"default-cloud1-ceil-event-smartgateway-596fcd4567-w6w4h\" (UID: \"bfbd463d-c0ee-40b4-b7af-d363939c485f\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-596fcd4567-w6w4h" Oct 03 00:38:08 crc kubenswrapper[4832]: I1003 00:38:08.250857 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/bfbd463d-c0ee-40b4-b7af-d363939c485f-socket-dir\") pod \"default-cloud1-ceil-event-smartgateway-596fcd4567-w6w4h\" (UID: \"bfbd463d-c0ee-40b4-b7af-d363939c485f\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-596fcd4567-w6w4h" Oct 03 00:38:08 crc kubenswrapper[4832]: I1003 00:38:08.250923 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-certs\" (UniqueName: \"kubernetes.io/secret/bfbd463d-c0ee-40b4-b7af-d363939c485f-elastic-certs\") pod \"default-cloud1-ceil-event-smartgateway-596fcd4567-w6w4h\" (UID: \"bfbd463d-c0ee-40b4-b7af-d363939c485f\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-596fcd4567-w6w4h" Oct 03 00:38:08 crc kubenswrapper[4832]: I1003 00:38:08.352145 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4l6t\" (UniqueName: \"kubernetes.io/projected/bfbd463d-c0ee-40b4-b7af-d363939c485f-kube-api-access-m4l6t\") pod \"default-cloud1-ceil-event-smartgateway-596fcd4567-w6w4h\" (UID: \"bfbd463d-c0ee-40b4-b7af-d363939c485f\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-596fcd4567-w6w4h" Oct 03 00:38:08 crc kubenswrapper[4832]: I1003 00:38:08.352241 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/bfbd463d-c0ee-40b4-b7af-d363939c485f-socket-dir\") pod \"default-cloud1-ceil-event-smartgateway-596fcd4567-w6w4h\" (UID: \"bfbd463d-c0ee-40b4-b7af-d363939c485f\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-596fcd4567-w6w4h" Oct 03 00:38:08 crc kubenswrapper[4832]: I1003 00:38:08.352284 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-certs\" (UniqueName: \"kubernetes.io/secret/bfbd463d-c0ee-40b4-b7af-d363939c485f-elastic-certs\") pod 
\"default-cloud1-ceil-event-smartgateway-596fcd4567-w6w4h\" (UID: \"bfbd463d-c0ee-40b4-b7af-d363939c485f\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-596fcd4567-w6w4h" Oct 03 00:38:08 crc kubenswrapper[4832]: I1003 00:38:08.352338 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/bfbd463d-c0ee-40b4-b7af-d363939c485f-sg-core-config\") pod \"default-cloud1-ceil-event-smartgateway-596fcd4567-w6w4h\" (UID: \"bfbd463d-c0ee-40b4-b7af-d363939c485f\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-596fcd4567-w6w4h" Oct 03 00:38:08 crc kubenswrapper[4832]: I1003 00:38:08.352955 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/bfbd463d-c0ee-40b4-b7af-d363939c485f-socket-dir\") pod \"default-cloud1-ceil-event-smartgateway-596fcd4567-w6w4h\" (UID: \"bfbd463d-c0ee-40b4-b7af-d363939c485f\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-596fcd4567-w6w4h" Oct 03 00:38:08 crc kubenswrapper[4832]: I1003 00:38:08.353579 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/bfbd463d-c0ee-40b4-b7af-d363939c485f-sg-core-config\") pod \"default-cloud1-ceil-event-smartgateway-596fcd4567-w6w4h\" (UID: \"bfbd463d-c0ee-40b4-b7af-d363939c485f\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-596fcd4567-w6w4h" Oct 03 00:38:08 crc kubenswrapper[4832]: I1003 00:38:08.363993 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-certs\" (UniqueName: \"kubernetes.io/secret/bfbd463d-c0ee-40b4-b7af-d363939c485f-elastic-certs\") pod \"default-cloud1-ceil-event-smartgateway-596fcd4567-w6w4h\" (UID: \"bfbd463d-c0ee-40b4-b7af-d363939c485f\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-596fcd4567-w6w4h" Oct 03 00:38:08 crc kubenswrapper[4832]: I1003 00:38:08.378486 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4l6t\" (UniqueName: \"kubernetes.io/projected/bfbd463d-c0ee-40b4-b7af-d363939c485f-kube-api-access-m4l6t\") pod \"default-cloud1-ceil-event-smartgateway-596fcd4567-w6w4h\" (UID: \"bfbd463d-c0ee-40b4-b7af-d363939c485f\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-596fcd4567-w6w4h" Oct 03 00:38:08 crc kubenswrapper[4832]: I1003 00:38:08.482142 4832 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-596fcd4567-w6w4h" Oct 03 00:38:13 crc kubenswrapper[4832]: I1003 00:38:13.122594 4832 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="service-telemetry/prometheus-default-0" Oct 03 00:38:13 crc kubenswrapper[4832]: I1003 00:38:13.162436 4832 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="service-telemetry/prometheus-default-0" Oct 03 00:38:13 crc kubenswrapper[4832]: I1003 00:38:13.393059 4832 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="service-telemetry/prometheus-default-0" Oct 03 00:38:19 crc kubenswrapper[4832]: I1003 00:38:19.174850 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-coll-event-smartgateway-5c89775d54-8f5sj"] Oct 03 00:38:19 crc kubenswrapper[4832]: I1003 00:38:19.238367 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-ceil-event-smartgateway-596fcd4567-w6w4h"] Oct 03 00:38:19 crc kubenswrapper[4832]: E1003 00:38:19.288401 4832 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="image-registry.openshift-image-registry.svc:5000/service-telemetry/sg-core:latest" Oct 03 00:38:19 crc kubenswrapper[4832]: E1003 00:38:19.288566 4832 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:sg-core,Image:image-registry.openshift-image-registry.svc:5000/service-telemetry/sg-core:latest,Command:[],Args:[-config /etc/sg-core/sg-core.conf.yaml],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:prom-https,HostPort:0,ContainerPort:8083,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:MY_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:sg-core-config,ReadOnly:true,MountPath:/etc/sg-core/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-k8jwf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000670000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod default-cloud1-coll-meter-smartgateway-7cd87f9766-jvpfj_service-telemetry(88373a98-b547-43d0-982f-69c1fd9f5507): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 03 00:38:19 crc kubenswrapper[4832]: E1003 00:38:19.289784 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"sg-core\" 
with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-jvpfj" podUID="88373a98-b547-43d0-982f-69c1fd9f5507" Oct 03 00:38:19 crc kubenswrapper[4832]: E1003 00:38:19.363542 4832 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="image-registry.openshift-image-registry.svc:5000/service-telemetry/sg-core:latest" Oct 03 00:38:19 crc kubenswrapper[4832]: E1003 00:38:19.363734 4832 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:sg-core,Image:image-registry.openshift-image-registry.svc:5000/service-telemetry/sg-core:latest,Command:[],Args:[-config /etc/sg-core/sg-core.conf.yaml],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:prom-https,HostPort:0,ContainerPort:8083,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:MY_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:sg-core-config,ReadOnly:true,MountPath:/etc/sg-core/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5dc56,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000670000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod default-cloud1-ceil-meter-smartgateway-57948895dc-n2mcv_service-telemetry(4d0adf6e-7076-4af6-8e81-54e16611154f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 03 00:38:19 crc kubenswrapper[4832]: E1003 00:38:19.365039 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"sg-core\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-n2mcv" podUID="4d0adf6e-7076-4af6-8e81-54e16611154f" Oct 03 00:38:19 crc kubenswrapper[4832]: I1003 00:38:19.401855 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-5c89775d54-8f5sj" event={"ID":"d30f9d9a-e3d9-4ea2-a2bb-7cae3b2f28fd","Type":"ContainerStarted","Data":"7d0ace5fb3f060ea031f7500b58cc7c9db729775b7f0206cd94a9d892f16590e"} Oct 03 00:38:19 crc kubenswrapper[4832]: I1003 00:38:19.403278 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-596fcd4567-w6w4h" 
event={"ID":"bfbd463d-c0ee-40b4-b7af-d363939c485f","Type":"ContainerStarted","Data":"7f098756dea830f24da3f258d73a10782d2920afc2f6f0dbd577bccd72211cea"} Oct 03 00:38:19 crc kubenswrapper[4832]: E1003 00:38:19.405167 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"sg-core\" with ImagePullBackOff: \"Back-off pulling image \\\"image-registry.openshift-image-registry.svc:5000/service-telemetry/sg-core:latest\\\"\"" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-n2mcv" podUID="4d0adf6e-7076-4af6-8e81-54e16611154f" Oct 03 00:38:19 crc kubenswrapper[4832]: E1003 00:38:19.405214 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"sg-core\" with ImagePullBackOff: \"Back-off pulling image \\\"image-registry.openshift-image-registry.svc:5000/service-telemetry/sg-core:latest\\\"\"" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-jvpfj" podUID="88373a98-b547-43d0-982f-69c1fd9f5507" Oct 03 00:38:19 crc kubenswrapper[4832]: E1003 00:38:19.455127 4832 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="image-registry.openshift-image-registry.svc:5000/service-telemetry/sg-core:latest" Oct 03 00:38:19 crc kubenswrapper[4832]: E1003 00:38:19.455274 4832 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:sg-core,Image:image-registry.openshift-image-registry.svc:5000/service-telemetry/sg-core:latest,Command:[],Args:[-config /etc/sg-core/sg-core.conf.yaml],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:prom-https,HostPort:0,ContainerPort:8083,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:MY_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:sg-core-config,ReadOnly:true,MountPath:/etc/sg-core/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-b5dhl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000670000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod default-cloud1-sens-meter-smartgateway-5759b4d97-9x79p_service-telemetry(c76c3820-5067-4afd-8954-5124625f3462): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 03 00:38:19 crc kubenswrapper[4832]: E1003 00:38:19.458085 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"sg-core\" with ErrImagePull: \"rpc 
error: code = Canceled desc = copying config: context canceled\"" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-9x79p" podUID="c76c3820-5067-4afd-8954-5124625f3462" Oct 03 00:38:20 crc kubenswrapper[4832]: I1003 00:38:20.127579 4832 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-9xcc5"] Oct 03 00:38:20 crc kubenswrapper[4832]: I1003 00:38:20.128170 4832 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/default-interconnect-68864d46cb-9xcc5" podUID="f410043a-d947-4f9e-99d2-31cffc97693f" containerName="default-interconnect" containerID="cri-o://ef7f077487ccc3693384df24c7474de5cc7d881136911ca45b63d6c0c1c32272" gracePeriod=30 Oct 03 00:38:20 crc kubenswrapper[4832]: I1003 00:38:20.415700 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-5c89775d54-8f5sj" event={"ID":"d30f9d9a-e3d9-4ea2-a2bb-7cae3b2f28fd","Type":"ContainerStarted","Data":"282ce77106a8469f2bc2af98c6ed9757bf4acf2b0ee1e44bc5d0c574872f4827"} Oct 03 00:38:20 crc kubenswrapper[4832]: I1003 00:38:20.416028 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-5c89775d54-8f5sj" event={"ID":"d30f9d9a-e3d9-4ea2-a2bb-7cae3b2f28fd","Type":"ContainerStarted","Data":"bea90df3fa7d5d2d51a8084b8e7459a2036045179e6072db462e5b017b2e7004"} Oct 03 00:38:20 crc kubenswrapper[4832]: I1003 00:38:20.419005 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-596fcd4567-w6w4h" event={"ID":"bfbd463d-c0ee-40b4-b7af-d363939c485f","Type":"ContainerStarted","Data":"c3b26a936f2de613a014ab1d2da256cd6e27b85b84473112e21ad2f30f52d876"} Oct 03 00:38:20 crc kubenswrapper[4832]: I1003 00:38:20.419047 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-596fcd4567-w6w4h" event={"ID":"bfbd463d-c0ee-40b4-b7af-d363939c485f","Type":"ContainerStarted","Data":"7a8921680218e1c5410939affe5550db1f4c1f5b6ce1265c269cf4a2fbf00ef4"} Oct 03 00:38:20 crc kubenswrapper[4832]: I1003 00:38:20.420279 4832 generic.go:334] "Generic (PLEG): container finished" podID="f410043a-d947-4f9e-99d2-31cffc97693f" containerID="ef7f077487ccc3693384df24c7474de5cc7d881136911ca45b63d6c0c1c32272" exitCode=0 Oct 03 00:38:20 crc kubenswrapper[4832]: I1003 00:38:20.420326 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-interconnect-68864d46cb-9xcc5" event={"ID":"f410043a-d947-4f9e-99d2-31cffc97693f","Type":"ContainerDied","Data":"ef7f077487ccc3693384df24c7474de5cc7d881136911ca45b63d6c0c1c32272"} Oct 03 00:38:20 crc kubenswrapper[4832]: E1003 00:38:20.423652 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"sg-core\" with ImagePullBackOff: \"Back-off pulling image \\\"image-registry.openshift-image-registry.svc:5000/service-telemetry/sg-core:latest\\\"\"" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-9x79p" podUID="c76c3820-5067-4afd-8954-5124625f3462" Oct 03 00:38:20 crc kubenswrapper[4832]: I1003 00:38:20.436953 4832 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-cloud1-coll-event-smartgateway-5c89775d54-8f5sj" podStartSLOduration=13.017699999 podStartE2EDuration="13.436907706s" podCreationTimestamp="2025-10-03 00:38:07 +0000 UTC" firstStartedPulling="2025-10-03 
00:38:19.177302851 +0000 UTC m=+1892.439457947" lastFinishedPulling="2025-10-03 00:38:19.596510548 +0000 UTC m=+1892.858665654" observedRunningTime="2025-10-03 00:38:20.433184819 +0000 UTC m=+1893.695339935" watchObservedRunningTime="2025-10-03 00:38:20.436907706 +0000 UTC m=+1893.699062812" Oct 03 00:38:20 crc kubenswrapper[4832]: I1003 00:38:20.481556 4832 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-596fcd4567-w6w4h" podStartSLOduration=12.127513828 podStartE2EDuration="12.481530502s" podCreationTimestamp="2025-10-03 00:38:08 +0000 UTC" firstStartedPulling="2025-10-03 00:38:19.243238128 +0000 UTC m=+1892.505393224" lastFinishedPulling="2025-10-03 00:38:19.597254792 +0000 UTC m=+1892.859409898" observedRunningTime="2025-10-03 00:38:20.476675389 +0000 UTC m=+1893.738830505" watchObservedRunningTime="2025-10-03 00:38:20.481530502 +0000 UTC m=+1893.743685608" Oct 03 00:38:20 crc kubenswrapper[4832]: I1003 00:38:20.530796 4832 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-interconnect-68864d46cb-9xcc5" Oct 03 00:38:20 crc kubenswrapper[4832]: I1003 00:38:20.644640 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/f410043a-d947-4f9e-99d2-31cffc97693f-sasl-users\") pod \"f410043a-d947-4f9e-99d2-31cffc97693f\" (UID: \"f410043a-d947-4f9e-99d2-31cffc97693f\") " Oct 03 00:38:20 crc kubenswrapper[4832]: I1003 00:38:20.644722 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jbtrl\" (UniqueName: \"kubernetes.io/projected/f410043a-d947-4f9e-99d2-31cffc97693f-kube-api-access-jbtrl\") pod \"f410043a-d947-4f9e-99d2-31cffc97693f\" (UID: \"f410043a-d947-4f9e-99d2-31cffc97693f\") " Oct 03 00:38:20 crc kubenswrapper[4832]: I1003 00:38:20.644802 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/f410043a-d947-4f9e-99d2-31cffc97693f-default-interconnect-inter-router-credentials\") pod \"f410043a-d947-4f9e-99d2-31cffc97693f\" (UID: \"f410043a-d947-4f9e-99d2-31cffc97693f\") " Oct 03 00:38:20 crc kubenswrapper[4832]: I1003 00:38:20.644833 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/f410043a-d947-4f9e-99d2-31cffc97693f-default-interconnect-openstack-credentials\") pod \"f410043a-d947-4f9e-99d2-31cffc97693f\" (UID: \"f410043a-d947-4f9e-99d2-31cffc97693f\") " Oct 03 00:38:20 crc kubenswrapper[4832]: I1003 00:38:20.644890 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/f410043a-d947-4f9e-99d2-31cffc97693f-default-interconnect-inter-router-ca\") pod \"f410043a-d947-4f9e-99d2-31cffc97693f\" (UID: \"f410043a-d947-4f9e-99d2-31cffc97693f\") " Oct 03 00:38:20 crc kubenswrapper[4832]: I1003 00:38:20.644915 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/f410043a-d947-4f9e-99d2-31cffc97693f-default-interconnect-openstack-ca\") pod \"f410043a-d947-4f9e-99d2-31cffc97693f\" (UID: \"f410043a-d947-4f9e-99d2-31cffc97693f\") " Oct 03 00:38:20 crc kubenswrapper[4832]: I1003 00:38:20.644944 4832 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/f410043a-d947-4f9e-99d2-31cffc97693f-sasl-config\") pod \"f410043a-d947-4f9e-99d2-31cffc97693f\" (UID: \"f410043a-d947-4f9e-99d2-31cffc97693f\") " Oct 03 00:38:20 crc kubenswrapper[4832]: I1003 00:38:20.645593 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f410043a-d947-4f9e-99d2-31cffc97693f-sasl-config" (OuterVolumeSpecName: "sasl-config") pod "f410043a-d947-4f9e-99d2-31cffc97693f" (UID: "f410043a-d947-4f9e-99d2-31cffc97693f"). InnerVolumeSpecName "sasl-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 00:38:20 crc kubenswrapper[4832]: I1003 00:38:20.651827 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f410043a-d947-4f9e-99d2-31cffc97693f-kube-api-access-jbtrl" (OuterVolumeSpecName: "kube-api-access-jbtrl") pod "f410043a-d947-4f9e-99d2-31cffc97693f" (UID: "f410043a-d947-4f9e-99d2-31cffc97693f"). InnerVolumeSpecName "kube-api-access-jbtrl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 00:38:20 crc kubenswrapper[4832]: I1003 00:38:20.651848 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f410043a-d947-4f9e-99d2-31cffc97693f-sasl-users" (OuterVolumeSpecName: "sasl-users") pod "f410043a-d947-4f9e-99d2-31cffc97693f" (UID: "f410043a-d947-4f9e-99d2-31cffc97693f"). InnerVolumeSpecName "sasl-users". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 00:38:20 crc kubenswrapper[4832]: I1003 00:38:20.651992 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f410043a-d947-4f9e-99d2-31cffc97693f-default-interconnect-inter-router-ca" (OuterVolumeSpecName: "default-interconnect-inter-router-ca") pod "f410043a-d947-4f9e-99d2-31cffc97693f" (UID: "f410043a-d947-4f9e-99d2-31cffc97693f"). InnerVolumeSpecName "default-interconnect-inter-router-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 00:38:20 crc kubenswrapper[4832]: I1003 00:38:20.652150 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f410043a-d947-4f9e-99d2-31cffc97693f-default-interconnect-openstack-credentials" (OuterVolumeSpecName: "default-interconnect-openstack-credentials") pod "f410043a-d947-4f9e-99d2-31cffc97693f" (UID: "f410043a-d947-4f9e-99d2-31cffc97693f"). InnerVolumeSpecName "default-interconnect-openstack-credentials". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 00:38:20 crc kubenswrapper[4832]: I1003 00:38:20.652201 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f410043a-d947-4f9e-99d2-31cffc97693f-default-interconnect-openstack-ca" (OuterVolumeSpecName: "default-interconnect-openstack-ca") pod "f410043a-d947-4f9e-99d2-31cffc97693f" (UID: "f410043a-d947-4f9e-99d2-31cffc97693f"). InnerVolumeSpecName "default-interconnect-openstack-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 00:38:20 crc kubenswrapper[4832]: I1003 00:38:20.652276 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f410043a-d947-4f9e-99d2-31cffc97693f-default-interconnect-inter-router-credentials" (OuterVolumeSpecName: "default-interconnect-inter-router-credentials") pod "f410043a-d947-4f9e-99d2-31cffc97693f" (UID: "f410043a-d947-4f9e-99d2-31cffc97693f"). 
InnerVolumeSpecName "default-interconnect-inter-router-credentials". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 00:38:20 crc kubenswrapper[4832]: I1003 00:38:20.747050 4832 reconciler_common.go:293] "Volume detached for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/f410043a-d947-4f9e-99d2-31cffc97693f-default-interconnect-inter-router-ca\") on node \"crc\" DevicePath \"\"" Oct 03 00:38:20 crc kubenswrapper[4832]: I1003 00:38:20.747100 4832 reconciler_common.go:293] "Volume detached for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/f410043a-d947-4f9e-99d2-31cffc97693f-default-interconnect-openstack-ca\") on node \"crc\" DevicePath \"\"" Oct 03 00:38:20 crc kubenswrapper[4832]: I1003 00:38:20.747115 4832 reconciler_common.go:293] "Volume detached for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/f410043a-d947-4f9e-99d2-31cffc97693f-sasl-config\") on node \"crc\" DevicePath \"\"" Oct 03 00:38:20 crc kubenswrapper[4832]: I1003 00:38:20.747133 4832 reconciler_common.go:293] "Volume detached for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/f410043a-d947-4f9e-99d2-31cffc97693f-sasl-users\") on node \"crc\" DevicePath \"\"" Oct 03 00:38:20 crc kubenswrapper[4832]: I1003 00:38:20.747145 4832 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jbtrl\" (UniqueName: \"kubernetes.io/projected/f410043a-d947-4f9e-99d2-31cffc97693f-kube-api-access-jbtrl\") on node \"crc\" DevicePath \"\"" Oct 03 00:38:20 crc kubenswrapper[4832]: I1003 00:38:20.747157 4832 reconciler_common.go:293] "Volume detached for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/f410043a-d947-4f9e-99d2-31cffc97693f-default-interconnect-inter-router-credentials\") on node \"crc\" DevicePath \"\"" Oct 03 00:38:20 crc kubenswrapper[4832]: I1003 00:38:20.747168 4832 reconciler_common.go:293] "Volume detached for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/f410043a-d947-4f9e-99d2-31cffc97693f-default-interconnect-openstack-credentials\") on node \"crc\" DevicePath \"\"" Oct 03 00:38:21 crc kubenswrapper[4832]: I1003 00:38:21.427965 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-interconnect-68864d46cb-9xcc5" event={"ID":"f410043a-d947-4f9e-99d2-31cffc97693f","Type":"ContainerDied","Data":"cfb1166ca5c6990f2c0d92bc34201642480bcd3e4fdf35c6282e0b582e36abdf"} Oct 03 00:38:21 crc kubenswrapper[4832]: I1003 00:38:21.427992 4832 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-interconnect-68864d46cb-9xcc5" Oct 03 00:38:21 crc kubenswrapper[4832]: I1003 00:38:21.428025 4832 scope.go:117] "RemoveContainer" containerID="ef7f077487ccc3693384df24c7474de5cc7d881136911ca45b63d6c0c1c32272" Oct 03 00:38:21 crc kubenswrapper[4832]: I1003 00:38:21.430716 4832 generic.go:334] "Generic (PLEG): container finished" podID="c76c3820-5067-4afd-8954-5124625f3462" containerID="20a54f8e303a53eb2f314adae3548ac632784477780293283d0bfc3f3a03ee12" exitCode=0 Oct 03 00:38:21 crc kubenswrapper[4832]: I1003 00:38:21.430814 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-9x79p" event={"ID":"c76c3820-5067-4afd-8954-5124625f3462","Type":"ContainerDied","Data":"20a54f8e303a53eb2f314adae3548ac632784477780293283d0bfc3f3a03ee12"} Oct 03 00:38:21 crc kubenswrapper[4832]: I1003 00:38:21.431426 4832 scope.go:117] "RemoveContainer" containerID="20a54f8e303a53eb2f314adae3548ac632784477780293283d0bfc3f3a03ee12" Oct 03 00:38:21 crc kubenswrapper[4832]: I1003 00:38:21.433922 4832 generic.go:334] "Generic (PLEG): container finished" podID="4d0adf6e-7076-4af6-8e81-54e16611154f" containerID="e197b466155eb56f654e1b6ebe340d69b2474b14f67acf3efe1e05b8a9b1fe25" exitCode=0 Oct 03 00:38:21 crc kubenswrapper[4832]: I1003 00:38:21.434003 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-n2mcv" event={"ID":"4d0adf6e-7076-4af6-8e81-54e16611154f","Type":"ContainerDied","Data":"e197b466155eb56f654e1b6ebe340d69b2474b14f67acf3efe1e05b8a9b1fe25"} Oct 03 00:38:21 crc kubenswrapper[4832]: I1003 00:38:21.434465 4832 scope.go:117] "RemoveContainer" containerID="e197b466155eb56f654e1b6ebe340d69b2474b14f67acf3efe1e05b8a9b1fe25" Oct 03 00:38:21 crc kubenswrapper[4832]: I1003 00:38:21.441580 4832 generic.go:334] "Generic (PLEG): container finished" podID="d30f9d9a-e3d9-4ea2-a2bb-7cae3b2f28fd" containerID="bea90df3fa7d5d2d51a8084b8e7459a2036045179e6072db462e5b017b2e7004" exitCode=0 Oct 03 00:38:21 crc kubenswrapper[4832]: I1003 00:38:21.441637 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-5c89775d54-8f5sj" event={"ID":"d30f9d9a-e3d9-4ea2-a2bb-7cae3b2f28fd","Type":"ContainerDied","Data":"bea90df3fa7d5d2d51a8084b8e7459a2036045179e6072db462e5b017b2e7004"} Oct 03 00:38:21 crc kubenswrapper[4832]: I1003 00:38:21.442062 4832 scope.go:117] "RemoveContainer" containerID="bea90df3fa7d5d2d51a8084b8e7459a2036045179e6072db462e5b017b2e7004" Oct 03 00:38:21 crc kubenswrapper[4832]: I1003 00:38:21.443872 4832 generic.go:334] "Generic (PLEG): container finished" podID="bfbd463d-c0ee-40b4-b7af-d363939c485f" containerID="7a8921680218e1c5410939affe5550db1f4c1f5b6ce1265c269cf4a2fbf00ef4" exitCode=0 Oct 03 00:38:21 crc kubenswrapper[4832]: I1003 00:38:21.443959 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-596fcd4567-w6w4h" event={"ID":"bfbd463d-c0ee-40b4-b7af-d363939c485f","Type":"ContainerDied","Data":"7a8921680218e1c5410939affe5550db1f4c1f5b6ce1265c269cf4a2fbf00ef4"} Oct 03 00:38:21 crc kubenswrapper[4832]: I1003 00:38:21.444069 4832 scope.go:117] "RemoveContainer" containerID="7a8921680218e1c5410939affe5550db1f4c1f5b6ce1265c269cf4a2fbf00ef4" Oct 03 00:38:21 crc kubenswrapper[4832]: I1003 00:38:21.446477 4832 generic.go:334] "Generic (PLEG): container finished" 
podID="88373a98-b547-43d0-982f-69c1fd9f5507" containerID="5407c88691d2c6957b0a9d4be854793050dc8d09ec135b1b669d8b7b30c85b14" exitCode=0 Oct 03 00:38:21 crc kubenswrapper[4832]: I1003 00:38:21.446510 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-jvpfj" event={"ID":"88373a98-b547-43d0-982f-69c1fd9f5507","Type":"ContainerDied","Data":"5407c88691d2c6957b0a9d4be854793050dc8d09ec135b1b669d8b7b30c85b14"} Oct 03 00:38:21 crc kubenswrapper[4832]: I1003 00:38:21.446799 4832 scope.go:117] "RemoveContainer" containerID="5407c88691d2c6957b0a9d4be854793050dc8d09ec135b1b669d8b7b30c85b14" Oct 03 00:38:21 crc kubenswrapper[4832]: I1003 00:38:21.530881 4832 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-9xcc5"] Oct 03 00:38:21 crc kubenswrapper[4832]: I1003 00:38:21.536393 4832 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-9xcc5"] Oct 03 00:38:21 crc kubenswrapper[4832]: I1003 00:38:21.554809 4832 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-tqtcw"] Oct 03 00:38:21 crc kubenswrapper[4832]: E1003 00:38:21.555161 4832 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f410043a-d947-4f9e-99d2-31cffc97693f" containerName="default-interconnect" Oct 03 00:38:21 crc kubenswrapper[4832]: I1003 00:38:21.555183 4832 state_mem.go:107] "Deleted CPUSet assignment" podUID="f410043a-d947-4f9e-99d2-31cffc97693f" containerName="default-interconnect" Oct 03 00:38:21 crc kubenswrapper[4832]: I1003 00:38:21.555348 4832 memory_manager.go:354] "RemoveStaleState removing state" podUID="f410043a-d947-4f9e-99d2-31cffc97693f" containerName="default-interconnect" Oct 03 00:38:21 crc kubenswrapper[4832]: I1003 00:38:21.555867 4832 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-interconnect-68864d46cb-tqtcw" Oct 03 00:38:21 crc kubenswrapper[4832]: I1003 00:38:21.560259 4832 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-users" Oct 03 00:38:21 crc kubenswrapper[4832]: I1003 00:38:21.560307 4832 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-dockercfg-86mpx" Oct 03 00:38:21 crc kubenswrapper[4832]: I1003 00:38:21.560380 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"default-interconnect-sasl-config" Oct 03 00:38:21 crc kubenswrapper[4832]: I1003 00:38:21.560686 4832 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-inter-router-ca" Oct 03 00:38:21 crc kubenswrapper[4832]: I1003 00:38:21.560777 4832 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-openstack-ca" Oct 03 00:38:21 crc kubenswrapper[4832]: I1003 00:38:21.560834 4832 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-inter-router-credentials" Oct 03 00:38:21 crc kubenswrapper[4832]: I1003 00:38:21.564239 4832 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-openstack-credentials" Oct 03 00:38:21 crc kubenswrapper[4832]: I1003 00:38:21.582222 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-tqtcw"] Oct 03 00:38:21 crc kubenswrapper[4832]: I1003 00:38:21.668579 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/1804aec4-102a-4c92-903d-f5b680d9548d-sasl-users\") pod \"default-interconnect-68864d46cb-tqtcw\" (UID: \"1804aec4-102a-4c92-903d-f5b680d9548d\") " pod="service-telemetry/default-interconnect-68864d46cb-tqtcw" Oct 03 00:38:21 crc kubenswrapper[4832]: I1003 00:38:21.668734 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mt6nh\" (UniqueName: \"kubernetes.io/projected/1804aec4-102a-4c92-903d-f5b680d9548d-kube-api-access-mt6nh\") pod \"default-interconnect-68864d46cb-tqtcw\" (UID: \"1804aec4-102a-4c92-903d-f5b680d9548d\") " pod="service-telemetry/default-interconnect-68864d46cb-tqtcw" Oct 03 00:38:21 crc kubenswrapper[4832]: I1003 00:38:21.668834 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/1804aec4-102a-4c92-903d-f5b680d9548d-default-interconnect-inter-router-credentials\") pod \"default-interconnect-68864d46cb-tqtcw\" (UID: \"1804aec4-102a-4c92-903d-f5b680d9548d\") " pod="service-telemetry/default-interconnect-68864d46cb-tqtcw" Oct 03 00:38:21 crc kubenswrapper[4832]: I1003 00:38:21.668912 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/1804aec4-102a-4c92-903d-f5b680d9548d-default-interconnect-openstack-ca\") pod \"default-interconnect-68864d46cb-tqtcw\" (UID: \"1804aec4-102a-4c92-903d-f5b680d9548d\") " pod="service-telemetry/default-interconnect-68864d46cb-tqtcw" Oct 03 00:38:21 crc kubenswrapper[4832]: I1003 00:38:21.668982 4832 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/1804aec4-102a-4c92-903d-f5b680d9548d-default-interconnect-inter-router-ca\") pod \"default-interconnect-68864d46cb-tqtcw\" (UID: \"1804aec4-102a-4c92-903d-f5b680d9548d\") " pod="service-telemetry/default-interconnect-68864d46cb-tqtcw" Oct 03 00:38:21 crc kubenswrapper[4832]: I1003 00:38:21.669104 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/1804aec4-102a-4c92-903d-f5b680d9548d-sasl-config\") pod \"default-interconnect-68864d46cb-tqtcw\" (UID: \"1804aec4-102a-4c92-903d-f5b680d9548d\") " pod="service-telemetry/default-interconnect-68864d46cb-tqtcw" Oct 03 00:38:21 crc kubenswrapper[4832]: I1003 00:38:21.669204 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/1804aec4-102a-4c92-903d-f5b680d9548d-default-interconnect-openstack-credentials\") pod \"default-interconnect-68864d46cb-tqtcw\" (UID: \"1804aec4-102a-4c92-903d-f5b680d9548d\") " pod="service-telemetry/default-interconnect-68864d46cb-tqtcw" Oct 03 00:38:21 crc kubenswrapper[4832]: E1003 00:38:21.734373 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"sg-core\" with ImagePullBackOff: \"Back-off pulling image \\\"image-registry.openshift-image-registry.svc:5000/service-telemetry/sg-core:latest\\\"\"" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-n2mcv" podUID="4d0adf6e-7076-4af6-8e81-54e16611154f" Oct 03 00:38:21 crc kubenswrapper[4832]: I1003 00:38:21.770001 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/1804aec4-102a-4c92-903d-f5b680d9548d-default-interconnect-openstack-credentials\") pod \"default-interconnect-68864d46cb-tqtcw\" (UID: \"1804aec4-102a-4c92-903d-f5b680d9548d\") " pod="service-telemetry/default-interconnect-68864d46cb-tqtcw" Oct 03 00:38:21 crc kubenswrapper[4832]: I1003 00:38:21.770071 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/1804aec4-102a-4c92-903d-f5b680d9548d-sasl-users\") pod \"default-interconnect-68864d46cb-tqtcw\" (UID: \"1804aec4-102a-4c92-903d-f5b680d9548d\") " pod="service-telemetry/default-interconnect-68864d46cb-tqtcw" Oct 03 00:38:21 crc kubenswrapper[4832]: I1003 00:38:21.770110 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mt6nh\" (UniqueName: \"kubernetes.io/projected/1804aec4-102a-4c92-903d-f5b680d9548d-kube-api-access-mt6nh\") pod \"default-interconnect-68864d46cb-tqtcw\" (UID: \"1804aec4-102a-4c92-903d-f5b680d9548d\") " pod="service-telemetry/default-interconnect-68864d46cb-tqtcw" Oct 03 00:38:21 crc kubenswrapper[4832]: I1003 00:38:21.770135 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/1804aec4-102a-4c92-903d-f5b680d9548d-default-interconnect-inter-router-credentials\") pod \"default-interconnect-68864d46cb-tqtcw\" (UID: \"1804aec4-102a-4c92-903d-f5b680d9548d\") " pod="service-telemetry/default-interconnect-68864d46cb-tqtcw" Oct 03 00:38:21 crc kubenswrapper[4832]: I1003 
00:38:21.770155 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/1804aec4-102a-4c92-903d-f5b680d9548d-default-interconnect-openstack-ca\") pod \"default-interconnect-68864d46cb-tqtcw\" (UID: \"1804aec4-102a-4c92-903d-f5b680d9548d\") " pod="service-telemetry/default-interconnect-68864d46cb-tqtcw" Oct 03 00:38:21 crc kubenswrapper[4832]: I1003 00:38:21.770174 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/1804aec4-102a-4c92-903d-f5b680d9548d-default-interconnect-inter-router-ca\") pod \"default-interconnect-68864d46cb-tqtcw\" (UID: \"1804aec4-102a-4c92-903d-f5b680d9548d\") " pod="service-telemetry/default-interconnect-68864d46cb-tqtcw" Oct 03 00:38:21 crc kubenswrapper[4832]: I1003 00:38:21.770191 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/1804aec4-102a-4c92-903d-f5b680d9548d-sasl-config\") pod \"default-interconnect-68864d46cb-tqtcw\" (UID: \"1804aec4-102a-4c92-903d-f5b680d9548d\") " pod="service-telemetry/default-interconnect-68864d46cb-tqtcw" Oct 03 00:38:21 crc kubenswrapper[4832]: I1003 00:38:21.775846 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/1804aec4-102a-4c92-903d-f5b680d9548d-default-interconnect-openstack-credentials\") pod \"default-interconnect-68864d46cb-tqtcw\" (UID: \"1804aec4-102a-4c92-903d-f5b680d9548d\") " pod="service-telemetry/default-interconnect-68864d46cb-tqtcw" Oct 03 00:38:21 crc kubenswrapper[4832]: I1003 00:38:21.778415 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/1804aec4-102a-4c92-903d-f5b680d9548d-default-interconnect-openstack-ca\") pod \"default-interconnect-68864d46cb-tqtcw\" (UID: \"1804aec4-102a-4c92-903d-f5b680d9548d\") " pod="service-telemetry/default-interconnect-68864d46cb-tqtcw" Oct 03 00:38:21 crc kubenswrapper[4832]: I1003 00:38:21.780635 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/1804aec4-102a-4c92-903d-f5b680d9548d-default-interconnect-inter-router-credentials\") pod \"default-interconnect-68864d46cb-tqtcw\" (UID: \"1804aec4-102a-4c92-903d-f5b680d9548d\") " pod="service-telemetry/default-interconnect-68864d46cb-tqtcw" Oct 03 00:38:21 crc kubenswrapper[4832]: I1003 00:38:21.781210 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/1804aec4-102a-4c92-903d-f5b680d9548d-sasl-config\") pod \"default-interconnect-68864d46cb-tqtcw\" (UID: \"1804aec4-102a-4c92-903d-f5b680d9548d\") " pod="service-telemetry/default-interconnect-68864d46cb-tqtcw" Oct 03 00:38:21 crc kubenswrapper[4832]: I1003 00:38:21.784654 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/1804aec4-102a-4c92-903d-f5b680d9548d-default-interconnect-inter-router-ca\") pod \"default-interconnect-68864d46cb-tqtcw\" (UID: \"1804aec4-102a-4c92-903d-f5b680d9548d\") " pod="service-telemetry/default-interconnect-68864d46cb-tqtcw" Oct 03 00:38:21 crc kubenswrapper[4832]: I1003 00:38:21.785869 4832 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/1804aec4-102a-4c92-903d-f5b680d9548d-sasl-users\") pod \"default-interconnect-68864d46cb-tqtcw\" (UID: \"1804aec4-102a-4c92-903d-f5b680d9548d\") " pod="service-telemetry/default-interconnect-68864d46cb-tqtcw" Oct 03 00:38:21 crc kubenswrapper[4832]: I1003 00:38:21.791833 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mt6nh\" (UniqueName: \"kubernetes.io/projected/1804aec4-102a-4c92-903d-f5b680d9548d-kube-api-access-mt6nh\") pod \"default-interconnect-68864d46cb-tqtcw\" (UID: \"1804aec4-102a-4c92-903d-f5b680d9548d\") " pod="service-telemetry/default-interconnect-68864d46cb-tqtcw" Oct 03 00:38:21 crc kubenswrapper[4832]: I1003 00:38:21.873648 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-interconnect-68864d46cb-tqtcw" Oct 03 00:38:22 crc kubenswrapper[4832]: E1003 00:38:22.003566 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"sg-core\" with ImagePullBackOff: \"Back-off pulling image \\\"image-registry.openshift-image-registry.svc:5000/service-telemetry/sg-core:latest\\\"\"" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-jvpfj" podUID="88373a98-b547-43d0-982f-69c1fd9f5507" Oct 03 00:38:22 crc kubenswrapper[4832]: E1003 00:38:22.025091 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"sg-core\" with ImagePullBackOff: \"Back-off pulling image \\\"image-registry.openshift-image-registry.svc:5000/service-telemetry/sg-core:latest\\\"\"" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-9x79p" podUID="c76c3820-5067-4afd-8954-5124625f3462" Oct 03 00:38:22 crc kubenswrapper[4832]: I1003 00:38:22.331614 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-tqtcw"] Oct 03 00:38:22 crc kubenswrapper[4832]: I1003 00:38:22.455478 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-596fcd4567-w6w4h" event={"ID":"bfbd463d-c0ee-40b4-b7af-d363939c485f","Type":"ContainerStarted","Data":"14b3cf20ae01205aa3be84fa77b1665aedb16a7744c2bb617e1559ac5a2b4e23"} Oct 03 00:38:22 crc kubenswrapper[4832]: I1003 00:38:22.465578 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-jvpfj" event={"ID":"88373a98-b547-43d0-982f-69c1fd9f5507","Type":"ContainerStarted","Data":"73331e8e06bb6c1d9adebb8f45db5b35afc15a7d43ef07fdc4ddbbf9faf6cba7"} Oct 03 00:38:22 crc kubenswrapper[4832]: E1003 00:38:22.472349 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"sg-core\" with ImagePullBackOff: \"Back-off pulling image \\\"image-registry.openshift-image-registry.svc:5000/service-telemetry/sg-core:latest\\\"\"" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-jvpfj" podUID="88373a98-b547-43d0-982f-69c1fd9f5507" Oct 03 00:38:22 crc kubenswrapper[4832]: I1003 00:38:22.475258 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-9x79p" event={"ID":"c76c3820-5067-4afd-8954-5124625f3462","Type":"ContainerStarted","Data":"03aa8927196901672bc5ae16f0ef20df1211e3fbc57ce8377064240bb41189d0"} Oct 03 00:38:22 crc kubenswrapper[4832]: E1003 00:38:22.477730 4832 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"sg-core\" with ImagePullBackOff: \"Back-off pulling image \\\"image-registry.openshift-image-registry.svc:5000/service-telemetry/sg-core:latest\\\"\"" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-9x79p" podUID="c76c3820-5067-4afd-8954-5124625f3462" Oct 03 00:38:22 crc kubenswrapper[4832]: I1003 00:38:22.482683 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-n2mcv" event={"ID":"4d0adf6e-7076-4af6-8e81-54e16611154f","Type":"ContainerStarted","Data":"446db2266f5f3f8f6cb80c2228c482b4e97f5a9b079119cf3d39c4cacb0b2115"} Oct 03 00:38:22 crc kubenswrapper[4832]: E1003 00:38:22.491434 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"sg-core\" with ImagePullBackOff: \"Back-off pulling image \\\"image-registry.openshift-image-registry.svc:5000/service-telemetry/sg-core:latest\\\"\"" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-n2mcv" podUID="4d0adf6e-7076-4af6-8e81-54e16611154f" Oct 03 00:38:22 crc kubenswrapper[4832]: I1003 00:38:22.492057 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-5c89775d54-8f5sj" event={"ID":"d30f9d9a-e3d9-4ea2-a2bb-7cae3b2f28fd","Type":"ContainerStarted","Data":"5f84a769f71baf49f714793ba77d4dbb4de16cd6e4e7ff419ece08bf5f4ae11d"} Oct 03 00:38:22 crc kubenswrapper[4832]: I1003 00:38:22.498945 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-interconnect-68864d46cb-tqtcw" event={"ID":"1804aec4-102a-4c92-903d-f5b680d9548d","Type":"ContainerStarted","Data":"010189dfeb4151194937e6fc5de7b49cad8f53ede5bc0fe0d836211308556987"} Oct 03 00:38:22 crc kubenswrapper[4832]: I1003 00:38:22.546779 4832 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-interconnect-68864d46cb-tqtcw" podStartSLOduration=2.546742079 podStartE2EDuration="2.546742079s" podCreationTimestamp="2025-10-03 00:38:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 00:38:22.540799592 +0000 UTC m=+1895.802954728" watchObservedRunningTime="2025-10-03 00:38:22.546742079 +0000 UTC m=+1895.808897185" Oct 03 00:38:23 crc kubenswrapper[4832]: I1003 00:38:23.395962 4832 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f410043a-d947-4f9e-99d2-31cffc97693f" path="/var/lib/kubelet/pods/f410043a-d947-4f9e-99d2-31cffc97693f/volumes" Oct 03 00:38:23 crc kubenswrapper[4832]: I1003 00:38:23.507734 4832 generic.go:334] "Generic (PLEG): container finished" podID="bfbd463d-c0ee-40b4-b7af-d363939c485f" containerID="14b3cf20ae01205aa3be84fa77b1665aedb16a7744c2bb617e1559ac5a2b4e23" exitCode=0 Oct 03 00:38:23 crc kubenswrapper[4832]: I1003 00:38:23.507818 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-596fcd4567-w6w4h" event={"ID":"bfbd463d-c0ee-40b4-b7af-d363939c485f","Type":"ContainerDied","Data":"14b3cf20ae01205aa3be84fa77b1665aedb16a7744c2bb617e1559ac5a2b4e23"} Oct 03 00:38:23 crc kubenswrapper[4832]: I1003 00:38:23.507880 4832 scope.go:117] "RemoveContainer" containerID="7a8921680218e1c5410939affe5550db1f4c1f5b6ce1265c269cf4a2fbf00ef4" Oct 03 00:38:23 crc kubenswrapper[4832]: I1003 00:38:23.508190 4832 scope.go:117] "RemoveContainer" 
containerID="14b3cf20ae01205aa3be84fa77b1665aedb16a7744c2bb617e1559ac5a2b4e23" Oct 03 00:38:23 crc kubenswrapper[4832]: E1003 00:38:23.508454 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"bridge\" with CrashLoopBackOff: \"back-off 10s restarting failed container=bridge pod=default-cloud1-ceil-event-smartgateway-596fcd4567-w6w4h_service-telemetry(bfbd463d-c0ee-40b4-b7af-d363939c485f)\"" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-596fcd4567-w6w4h" podUID="bfbd463d-c0ee-40b4-b7af-d363939c485f" Oct 03 00:38:23 crc kubenswrapper[4832]: I1003 00:38:23.514973 4832 generic.go:334] "Generic (PLEG): container finished" podID="88373a98-b547-43d0-982f-69c1fd9f5507" containerID="73331e8e06bb6c1d9adebb8f45db5b35afc15a7d43ef07fdc4ddbbf9faf6cba7" exitCode=0 Oct 03 00:38:23 crc kubenswrapper[4832]: I1003 00:38:23.515026 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-jvpfj" event={"ID":"88373a98-b547-43d0-982f-69c1fd9f5507","Type":"ContainerDied","Data":"73331e8e06bb6c1d9adebb8f45db5b35afc15a7d43ef07fdc4ddbbf9faf6cba7"} Oct 03 00:38:23 crc kubenswrapper[4832]: I1003 00:38:23.515461 4832 scope.go:117] "RemoveContainer" containerID="73331e8e06bb6c1d9adebb8f45db5b35afc15a7d43ef07fdc4ddbbf9faf6cba7" Oct 03 00:38:23 crc kubenswrapper[4832]: E1003 00:38:23.517497 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"bridge\" with CrashLoopBackOff: \"back-off 10s restarting failed container=bridge pod=default-cloud1-coll-meter-smartgateway-7cd87f9766-jvpfj_service-telemetry(88373a98-b547-43d0-982f-69c1fd9f5507)\", failed to \"StartContainer\" for \"sg-core\" with ImagePullBackOff: \"Back-off pulling image \\\"image-registry.openshift-image-registry.svc:5000/service-telemetry/sg-core:latest\\\"\"]" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-jvpfj" podUID="88373a98-b547-43d0-982f-69c1fd9f5507" Oct 03 00:38:23 crc kubenswrapper[4832]: I1003 00:38:23.518834 4832 generic.go:334] "Generic (PLEG): container finished" podID="c76c3820-5067-4afd-8954-5124625f3462" containerID="03aa8927196901672bc5ae16f0ef20df1211e3fbc57ce8377064240bb41189d0" exitCode=0 Oct 03 00:38:23 crc kubenswrapper[4832]: I1003 00:38:23.518885 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-9x79p" event={"ID":"c76c3820-5067-4afd-8954-5124625f3462","Type":"ContainerDied","Data":"03aa8927196901672bc5ae16f0ef20df1211e3fbc57ce8377064240bb41189d0"} Oct 03 00:38:23 crc kubenswrapper[4832]: I1003 00:38:23.519212 4832 scope.go:117] "RemoveContainer" containerID="03aa8927196901672bc5ae16f0ef20df1211e3fbc57ce8377064240bb41189d0" Oct 03 00:38:23 crc kubenswrapper[4832]: E1003 00:38:23.522373 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"bridge\" with CrashLoopBackOff: \"back-off 10s restarting failed container=bridge pod=default-cloud1-sens-meter-smartgateway-5759b4d97-9x79p_service-telemetry(c76c3820-5067-4afd-8954-5124625f3462)\", failed to \"StartContainer\" for \"sg-core\" with ImagePullBackOff: \"Back-off pulling image \\\"image-registry.openshift-image-registry.svc:5000/service-telemetry/sg-core:latest\\\"\"]" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-9x79p" podUID="c76c3820-5067-4afd-8954-5124625f3462" Oct 03 00:38:23 crc kubenswrapper[4832]: I1003 00:38:23.530416 4832 
generic.go:334] "Generic (PLEG): container finished" podID="4d0adf6e-7076-4af6-8e81-54e16611154f" containerID="446db2266f5f3f8f6cb80c2228c482b4e97f5a9b079119cf3d39c4cacb0b2115" exitCode=0 Oct 03 00:38:23 crc kubenswrapper[4832]: I1003 00:38:23.530505 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-n2mcv" event={"ID":"4d0adf6e-7076-4af6-8e81-54e16611154f","Type":"ContainerDied","Data":"446db2266f5f3f8f6cb80c2228c482b4e97f5a9b079119cf3d39c4cacb0b2115"} Oct 03 00:38:23 crc kubenswrapper[4832]: I1003 00:38:23.531030 4832 scope.go:117] "RemoveContainer" containerID="446db2266f5f3f8f6cb80c2228c482b4e97f5a9b079119cf3d39c4cacb0b2115" Oct 03 00:38:23 crc kubenswrapper[4832]: E1003 00:38:23.532833 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"bridge\" with CrashLoopBackOff: \"back-off 10s restarting failed container=bridge pod=default-cloud1-ceil-meter-smartgateway-57948895dc-n2mcv_service-telemetry(4d0adf6e-7076-4af6-8e81-54e16611154f)\", failed to \"StartContainer\" for \"sg-core\" with ImagePullBackOff: \"Back-off pulling image \\\"image-registry.openshift-image-registry.svc:5000/service-telemetry/sg-core:latest\\\"\"]" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-n2mcv" podUID="4d0adf6e-7076-4af6-8e81-54e16611154f" Oct 03 00:38:23 crc kubenswrapper[4832]: I1003 00:38:23.533611 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-interconnect-68864d46cb-tqtcw" event={"ID":"1804aec4-102a-4c92-903d-f5b680d9548d","Type":"ContainerStarted","Data":"3bc338d8a28b4cf9c4b297ec7ef321f4ccdaea147935ddc91cc33e0e6a91bc82"} Oct 03 00:38:23 crc kubenswrapper[4832]: I1003 00:38:23.539305 4832 generic.go:334] "Generic (PLEG): container finished" podID="d30f9d9a-e3d9-4ea2-a2bb-7cae3b2f28fd" containerID="5f84a769f71baf49f714793ba77d4dbb4de16cd6e4e7ff419ece08bf5f4ae11d" exitCode=0 Oct 03 00:38:23 crc kubenswrapper[4832]: I1003 00:38:23.539347 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-5c89775d54-8f5sj" event={"ID":"d30f9d9a-e3d9-4ea2-a2bb-7cae3b2f28fd","Type":"ContainerDied","Data":"5f84a769f71baf49f714793ba77d4dbb4de16cd6e4e7ff419ece08bf5f4ae11d"} Oct 03 00:38:23 crc kubenswrapper[4832]: I1003 00:38:23.539856 4832 scope.go:117] "RemoveContainer" containerID="5f84a769f71baf49f714793ba77d4dbb4de16cd6e4e7ff419ece08bf5f4ae11d" Oct 03 00:38:23 crc kubenswrapper[4832]: E1003 00:38:23.540153 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"bridge\" with CrashLoopBackOff: \"back-off 10s restarting failed container=bridge pod=default-cloud1-coll-event-smartgateway-5c89775d54-8f5sj_service-telemetry(d30f9d9a-e3d9-4ea2-a2bb-7cae3b2f28fd)\"" pod="service-telemetry/default-cloud1-coll-event-smartgateway-5c89775d54-8f5sj" podUID="d30f9d9a-e3d9-4ea2-a2bb-7cae3b2f28fd" Oct 03 00:38:23 crc kubenswrapper[4832]: I1003 00:38:23.572418 4832 scope.go:117] "RemoveContainer" containerID="5407c88691d2c6957b0a9d4be854793050dc8d09ec135b1b669d8b7b30c85b14" Oct 03 00:38:23 crc kubenswrapper[4832]: I1003 00:38:23.611679 4832 scope.go:117] "RemoveContainer" containerID="20a54f8e303a53eb2f314adae3548ac632784477780293283d0bfc3f3a03ee12" Oct 03 00:38:23 crc kubenswrapper[4832]: I1003 00:38:23.648000 4832 scope.go:117] "RemoveContainer" containerID="e197b466155eb56f654e1b6ebe340d69b2474b14f67acf3efe1e05b8a9b1fe25" Oct 03 00:38:23 crc 
kubenswrapper[4832]: I1003 00:38:23.682832 4832 scope.go:117] "RemoveContainer" containerID="bea90df3fa7d5d2d51a8084b8e7459a2036045179e6072db462e5b017b2e7004" Oct 03 00:38:24 crc kubenswrapper[4832]: I1003 00:38:24.549075 4832 scope.go:117] "RemoveContainer" containerID="5f84a769f71baf49f714793ba77d4dbb4de16cd6e4e7ff419ece08bf5f4ae11d" Oct 03 00:38:24 crc kubenswrapper[4832]: E1003 00:38:24.549631 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"bridge\" with CrashLoopBackOff: \"back-off 10s restarting failed container=bridge pod=default-cloud1-coll-event-smartgateway-5c89775d54-8f5sj_service-telemetry(d30f9d9a-e3d9-4ea2-a2bb-7cae3b2f28fd)\"" pod="service-telemetry/default-cloud1-coll-event-smartgateway-5c89775d54-8f5sj" podUID="d30f9d9a-e3d9-4ea2-a2bb-7cae3b2f28fd" Oct 03 00:38:24 crc kubenswrapper[4832]: I1003 00:38:24.554663 4832 scope.go:117] "RemoveContainer" containerID="14b3cf20ae01205aa3be84fa77b1665aedb16a7744c2bb617e1559ac5a2b4e23" Oct 03 00:38:24 crc kubenswrapper[4832]: E1003 00:38:24.554964 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"bridge\" with CrashLoopBackOff: \"back-off 10s restarting failed container=bridge pod=default-cloud1-ceil-event-smartgateway-596fcd4567-w6w4h_service-telemetry(bfbd463d-c0ee-40b4-b7af-d363939c485f)\"" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-596fcd4567-w6w4h" podUID="bfbd463d-c0ee-40b4-b7af-d363939c485f" Oct 03 00:38:35 crc kubenswrapper[4832]: I1003 00:38:35.386848 4832 scope.go:117] "RemoveContainer" containerID="03aa8927196901672bc5ae16f0ef20df1211e3fbc57ce8377064240bb41189d0" Oct 03 00:38:35 crc kubenswrapper[4832]: I1003 00:38:35.639524 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-9x79p" event={"ID":"c76c3820-5067-4afd-8954-5124625f3462","Type":"ContainerStarted","Data":"f1c60b4505aab6a815420f91de584513a6c46359ae8066fad798e8000b27e2ef"} Oct 03 00:38:36 crc kubenswrapper[4832]: I1003 00:38:36.386905 4832 scope.go:117] "RemoveContainer" containerID="5f84a769f71baf49f714793ba77d4dbb4de16cd6e4e7ff419ece08bf5f4ae11d" Oct 03 00:38:36 crc kubenswrapper[4832]: I1003 00:38:36.648680 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-5c89775d54-8f5sj" event={"ID":"d30f9d9a-e3d9-4ea2-a2bb-7cae3b2f28fd","Type":"ContainerStarted","Data":"a151d0bccbbb6ccc7dbd2f34815526ce1961e9bda04e9832fb1e2cfb1b93ce8c"} Oct 03 00:38:36 crc kubenswrapper[4832]: I1003 00:38:36.651367 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-9x79p" event={"ID":"c76c3820-5067-4afd-8954-5124625f3462","Type":"ContainerStarted","Data":"2a639aef9f88ee947356057b1c86a366ddbd6c7cf159a832cebb944da5feea10"} Oct 03 00:38:36 crc kubenswrapper[4832]: I1003 00:38:36.689953 4832 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-9x79p" podStartSLOduration=3.77018657 podStartE2EDuration="37.689929188s" podCreationTimestamp="2025-10-03 00:37:59 +0000 UTC" firstStartedPulling="2025-10-03 00:38:01.754315774 +0000 UTC m=+1875.016470880" lastFinishedPulling="2025-10-03 00:38:35.674058392 +0000 UTC m=+1908.936213498" observedRunningTime="2025-10-03 00:38:36.686156779 +0000 UTC m=+1909.948311885" watchObservedRunningTime="2025-10-03 00:38:36.689929188 +0000 UTC m=+1909.952084294" 
Oct 03 00:38:37 crc kubenswrapper[4832]: I1003 00:38:37.392202 4832 scope.go:117] "RemoveContainer" containerID="73331e8e06bb6c1d9adebb8f45db5b35afc15a7d43ef07fdc4ddbbf9faf6cba7" Oct 03 00:38:37 crc kubenswrapper[4832]: I1003 00:38:37.660866 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-jvpfj" event={"ID":"88373a98-b547-43d0-982f-69c1fd9f5507","Type":"ContainerStarted","Data":"8f7efcb2f23dbc1b93f3c1b4868f56c67272f6a276b38bb092920a2fb00b6b27"} Oct 03 00:38:38 crc kubenswrapper[4832]: I1003 00:38:38.386545 4832 scope.go:117] "RemoveContainer" containerID="14b3cf20ae01205aa3be84fa77b1665aedb16a7744c2bb617e1559ac5a2b4e23" Oct 03 00:38:38 crc kubenswrapper[4832]: I1003 00:38:38.386941 4832 scope.go:117] "RemoveContainer" containerID="446db2266f5f3f8f6cb80c2228c482b4e97f5a9b079119cf3d39c4cacb0b2115" Oct 03 00:38:38 crc kubenswrapper[4832]: I1003 00:38:38.671365 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-n2mcv" event={"ID":"4d0adf6e-7076-4af6-8e81-54e16611154f","Type":"ContainerStarted","Data":"12bedf0acce7f4ea0db4b56d93a71704eb44308d0aba8ef243879d260c71dd6a"} Oct 03 00:38:38 crc kubenswrapper[4832]: I1003 00:38:38.673465 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-596fcd4567-w6w4h" event={"ID":"bfbd463d-c0ee-40b4-b7af-d363939c485f","Type":"ContainerStarted","Data":"1fe63bbe3b078dfa94e391fe51e7c08c790fb7a120082181990c3c28e33973fe"} Oct 03 00:38:38 crc kubenswrapper[4832]: I1003 00:38:38.677850 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-jvpfj" event={"ID":"88373a98-b547-43d0-982f-69c1fd9f5507","Type":"ContainerStarted","Data":"84e802965ece8416cd20051bac34f9c5bf71e946176d0a19d5d3862362021f53"} Oct 03 00:38:38 crc kubenswrapper[4832]: I1003 00:38:38.710056 4832 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-jvpfj" podStartSLOduration=6.9945190329999996 podStartE2EDuration="46.710036113s" podCreationTimestamp="2025-10-03 00:37:52 +0000 UTC" firstStartedPulling="2025-10-03 00:37:57.957818251 +0000 UTC m=+1871.219973357" lastFinishedPulling="2025-10-03 00:38:37.673335331 +0000 UTC m=+1910.935490437" observedRunningTime="2025-10-03 00:38:38.705634985 +0000 UTC m=+1911.967790091" watchObservedRunningTime="2025-10-03 00:38:38.710036113 +0000 UTC m=+1911.972191219" Oct 03 00:38:39 crc kubenswrapper[4832]: I1003 00:38:39.686222 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-n2mcv" event={"ID":"4d0adf6e-7076-4af6-8e81-54e16611154f","Type":"ContainerStarted","Data":"d4cae7f59a56c50105c554fdf1f56fd756490b9adf3fa14c372f8a8fcd411d94"} Oct 03 00:38:39 crc kubenswrapper[4832]: I1003 00:38:39.705054 4832 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-n2mcv" podStartSLOduration=3.4687778590000002 podStartE2EDuration="43.705038083s" podCreationTimestamp="2025-10-03 00:37:56 +0000 UTC" firstStartedPulling="2025-10-03 00:37:58.479853061 +0000 UTC m=+1871.742008167" lastFinishedPulling="2025-10-03 00:38:38.716113285 +0000 UTC m=+1911.978268391" observedRunningTime="2025-10-03 00:38:39.703687099 +0000 UTC m=+1912.965842205" 
watchObservedRunningTime="2025-10-03 00:38:39.705038083 +0000 UTC m=+1912.967193189" Oct 03 00:38:49 crc kubenswrapper[4832]: I1003 00:38:49.156201 4832 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/qdr-test"] Oct 03 00:38:49 crc kubenswrapper[4832]: I1003 00:38:49.157450 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/qdr-test" Oct 03 00:38:49 crc kubenswrapper[4832]: I1003 00:38:49.160094 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"qdr-test-config" Oct 03 00:38:49 crc kubenswrapper[4832]: I1003 00:38:49.160305 4832 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-selfsigned" Oct 03 00:38:49 crc kubenswrapper[4832]: I1003 00:38:49.166063 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/qdr-test"] Oct 03 00:38:49 crc kubenswrapper[4832]: I1003 00:38:49.247422 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"qdr-test-config\" (UniqueName: \"kubernetes.io/configmap/24586eac-145b-4e1b-b7ce-1e6116a3d3d8-qdr-test-config\") pod \"qdr-test\" (UID: \"24586eac-145b-4e1b-b7ce-1e6116a3d3d8\") " pod="service-telemetry/qdr-test" Oct 03 00:38:49 crc kubenswrapper[4832]: I1003 00:38:49.247470 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-selfsigned-cert\" (UniqueName: \"kubernetes.io/secret/24586eac-145b-4e1b-b7ce-1e6116a3d3d8-default-interconnect-selfsigned-cert\") pod \"qdr-test\" (UID: \"24586eac-145b-4e1b-b7ce-1e6116a3d3d8\") " pod="service-telemetry/qdr-test" Oct 03 00:38:49 crc kubenswrapper[4832]: I1003 00:38:49.247504 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqhxh\" (UniqueName: \"kubernetes.io/projected/24586eac-145b-4e1b-b7ce-1e6116a3d3d8-kube-api-access-tqhxh\") pod \"qdr-test\" (UID: \"24586eac-145b-4e1b-b7ce-1e6116a3d3d8\") " pod="service-telemetry/qdr-test" Oct 03 00:38:49 crc kubenswrapper[4832]: I1003 00:38:49.348708 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"qdr-test-config\" (UniqueName: \"kubernetes.io/configmap/24586eac-145b-4e1b-b7ce-1e6116a3d3d8-qdr-test-config\") pod \"qdr-test\" (UID: \"24586eac-145b-4e1b-b7ce-1e6116a3d3d8\") " pod="service-telemetry/qdr-test" Oct 03 00:38:49 crc kubenswrapper[4832]: I1003 00:38:49.348789 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-selfsigned-cert\" (UniqueName: \"kubernetes.io/secret/24586eac-145b-4e1b-b7ce-1e6116a3d3d8-default-interconnect-selfsigned-cert\") pod \"qdr-test\" (UID: \"24586eac-145b-4e1b-b7ce-1e6116a3d3d8\") " pod="service-telemetry/qdr-test" Oct 03 00:38:49 crc kubenswrapper[4832]: I1003 00:38:49.348820 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tqhxh\" (UniqueName: \"kubernetes.io/projected/24586eac-145b-4e1b-b7ce-1e6116a3d3d8-kube-api-access-tqhxh\") pod \"qdr-test\" (UID: \"24586eac-145b-4e1b-b7ce-1e6116a3d3d8\") " pod="service-telemetry/qdr-test" Oct 03 00:38:49 crc kubenswrapper[4832]: I1003 00:38:49.351088 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"qdr-test-config\" (UniqueName: \"kubernetes.io/configmap/24586eac-145b-4e1b-b7ce-1e6116a3d3d8-qdr-test-config\") pod \"qdr-test\" (UID: \"24586eac-145b-4e1b-b7ce-1e6116a3d3d8\") " 
pod="service-telemetry/qdr-test" Oct 03 00:38:49 crc kubenswrapper[4832]: I1003 00:38:49.354866 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-selfsigned-cert\" (UniqueName: \"kubernetes.io/secret/24586eac-145b-4e1b-b7ce-1e6116a3d3d8-default-interconnect-selfsigned-cert\") pod \"qdr-test\" (UID: \"24586eac-145b-4e1b-b7ce-1e6116a3d3d8\") " pod="service-telemetry/qdr-test" Oct 03 00:38:49 crc kubenswrapper[4832]: I1003 00:38:49.366997 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqhxh\" (UniqueName: \"kubernetes.io/projected/24586eac-145b-4e1b-b7ce-1e6116a3d3d8-kube-api-access-tqhxh\") pod \"qdr-test\" (UID: \"24586eac-145b-4e1b-b7ce-1e6116a3d3d8\") " pod="service-telemetry/qdr-test" Oct 03 00:38:49 crc kubenswrapper[4832]: I1003 00:38:49.480254 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/qdr-test" Oct 03 00:38:50 crc kubenswrapper[4832]: I1003 00:38:50.353233 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/qdr-test"] Oct 03 00:38:50 crc kubenswrapper[4832]: W1003 00:38:50.356197 4832 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod24586eac_145b_4e1b_b7ce_1e6116a3d3d8.slice/crio-f6187eba762932f5d47ea8462f589f9bcc5ab54fda245d79abad00476bd6a6e0 WatchSource:0}: Error finding container f6187eba762932f5d47ea8462f589f9bcc5ab54fda245d79abad00476bd6a6e0: Status 404 returned error can't find the container with id f6187eba762932f5d47ea8462f589f9bcc5ab54fda245d79abad00476bd6a6e0 Oct 03 00:38:50 crc kubenswrapper[4832]: I1003 00:38:50.815978 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/qdr-test" event={"ID":"24586eac-145b-4e1b-b7ce-1e6116a3d3d8","Type":"ContainerStarted","Data":"f6187eba762932f5d47ea8462f589f9bcc5ab54fda245d79abad00476bd6a6e0"} Oct 03 00:38:56 crc kubenswrapper[4832]: I1003 00:38:56.396659 4832 patch_prober.go:28] interesting pod/machine-config-daemon-xzpk4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 00:38:56 crc kubenswrapper[4832]: I1003 00:38:56.397498 4832 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" podUID="13eeb165-19f7-4e3f-a442-1c9c570d853b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 00:38:56 crc kubenswrapper[4832]: I1003 00:38:56.856618 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/qdr-test" event={"ID":"24586eac-145b-4e1b-b7ce-1e6116a3d3d8","Type":"ContainerStarted","Data":"40a92e26e165317681be3aa89ddda0e15cf712303159c6995601a91a68800582"} Oct 03 00:38:56 crc kubenswrapper[4832]: I1003 00:38:56.868863 4832 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/qdr-test" podStartSLOduration=2.059258349 podStartE2EDuration="7.868849847s" podCreationTimestamp="2025-10-03 00:38:49 +0000 UTC" firstStartedPulling="2025-10-03 00:38:50.358064117 +0000 UTC m=+1923.620219223" lastFinishedPulling="2025-10-03 00:38:56.167655615 +0000 UTC m=+1929.429810721" observedRunningTime="2025-10-03 00:38:56.868018061 +0000 UTC m=+1930.130173187" 
watchObservedRunningTime="2025-10-03 00:38:56.868849847 +0000 UTC m=+1930.131004953" Oct 03 00:38:57 crc kubenswrapper[4832]: I1003 00:38:57.145075 4832 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/stf-smoketest-smoke1-qfb9r"] Oct 03 00:38:57 crc kubenswrapper[4832]: I1003 00:38:57.146337 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/stf-smoketest-smoke1-qfb9r" Oct 03 00:38:57 crc kubenswrapper[4832]: I1003 00:38:57.148409 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"stf-smoketest-sensubility-config" Oct 03 00:38:57 crc kubenswrapper[4832]: I1003 00:38:57.148466 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"stf-smoketest-collectd-config" Oct 03 00:38:57 crc kubenswrapper[4832]: I1003 00:38:57.149476 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"stf-smoketest-healthcheck-log" Oct 03 00:38:57 crc kubenswrapper[4832]: I1003 00:38:57.149678 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"stf-smoketest-ceilometer-publisher" Oct 03 00:38:57 crc kubenswrapper[4832]: I1003 00:38:57.150724 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"stf-smoketest-ceilometer-entrypoint-script" Oct 03 00:38:57 crc kubenswrapper[4832]: I1003 00:38:57.150795 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/stf-smoketest-smoke1-qfb9r"] Oct 03 00:38:57 crc kubenswrapper[4832]: I1003 00:38:57.151198 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"stf-smoketest-collectd-entrypoint-script" Oct 03 00:38:57 crc kubenswrapper[4832]: I1003 00:38:57.166278 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87nrd\" (UniqueName: \"kubernetes.io/projected/54b2f71d-77a1-4e08-a815-9ab83ceffd54-kube-api-access-87nrd\") pod \"stf-smoketest-smoke1-qfb9r\" (UID: \"54b2f71d-77a1-4e08-a815-9ab83ceffd54\") " pod="service-telemetry/stf-smoketest-smoke1-qfb9r" Oct 03 00:38:57 crc kubenswrapper[4832]: I1003 00:38:57.166494 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"healthcheck-log\" (UniqueName: \"kubernetes.io/configmap/54b2f71d-77a1-4e08-a815-9ab83ceffd54-healthcheck-log\") pod \"stf-smoketest-smoke1-qfb9r\" (UID: \"54b2f71d-77a1-4e08-a815-9ab83ceffd54\") " pod="service-telemetry/stf-smoketest-smoke1-qfb9r" Oct 03 00:38:57 crc kubenswrapper[4832]: I1003 00:38:57.166564 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collectd-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/54b2f71d-77a1-4e08-a815-9ab83ceffd54-collectd-entrypoint-script\") pod \"stf-smoketest-smoke1-qfb9r\" (UID: \"54b2f71d-77a1-4e08-a815-9ab83ceffd54\") " pod="service-telemetry/stf-smoketest-smoke1-qfb9r" Oct 03 00:38:57 crc kubenswrapper[4832]: I1003 00:38:57.166611 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collectd-config\" (UniqueName: \"kubernetes.io/configmap/54b2f71d-77a1-4e08-a815-9ab83ceffd54-collectd-config\") pod \"stf-smoketest-smoke1-qfb9r\" (UID: \"54b2f71d-77a1-4e08-a815-9ab83ceffd54\") " pod="service-telemetry/stf-smoketest-smoke1-qfb9r" Oct 03 00:38:57 crc kubenswrapper[4832]: I1003 00:38:57.166643 4832 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"sensubility-config\" (UniqueName: \"kubernetes.io/configmap/54b2f71d-77a1-4e08-a815-9ab83ceffd54-sensubility-config\") pod \"stf-smoketest-smoke1-qfb9r\" (UID: \"54b2f71d-77a1-4e08-a815-9ab83ceffd54\") " pod="service-telemetry/stf-smoketest-smoke1-qfb9r" Oct 03 00:38:57 crc kubenswrapper[4832]: I1003 00:38:57.166685 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-publisher\" (UniqueName: \"kubernetes.io/configmap/54b2f71d-77a1-4e08-a815-9ab83ceffd54-ceilometer-publisher\") pod \"stf-smoketest-smoke1-qfb9r\" (UID: \"54b2f71d-77a1-4e08-a815-9ab83ceffd54\") " pod="service-telemetry/stf-smoketest-smoke1-qfb9r" Oct 03 00:38:57 crc kubenswrapper[4832]: I1003 00:38:57.166776 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/54b2f71d-77a1-4e08-a815-9ab83ceffd54-ceilometer-entrypoint-script\") pod \"stf-smoketest-smoke1-qfb9r\" (UID: \"54b2f71d-77a1-4e08-a815-9ab83ceffd54\") " pod="service-telemetry/stf-smoketest-smoke1-qfb9r" Oct 03 00:38:57 crc kubenswrapper[4832]: I1003 00:38:57.268619 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/54b2f71d-77a1-4e08-a815-9ab83ceffd54-ceilometer-entrypoint-script\") pod \"stf-smoketest-smoke1-qfb9r\" (UID: \"54b2f71d-77a1-4e08-a815-9ab83ceffd54\") " pod="service-telemetry/stf-smoketest-smoke1-qfb9r" Oct 03 00:38:57 crc kubenswrapper[4832]: I1003 00:38:57.268704 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-87nrd\" (UniqueName: \"kubernetes.io/projected/54b2f71d-77a1-4e08-a815-9ab83ceffd54-kube-api-access-87nrd\") pod \"stf-smoketest-smoke1-qfb9r\" (UID: \"54b2f71d-77a1-4e08-a815-9ab83ceffd54\") " pod="service-telemetry/stf-smoketest-smoke1-qfb9r" Oct 03 00:38:57 crc kubenswrapper[4832]: I1003 00:38:57.268747 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"healthcheck-log\" (UniqueName: \"kubernetes.io/configmap/54b2f71d-77a1-4e08-a815-9ab83ceffd54-healthcheck-log\") pod \"stf-smoketest-smoke1-qfb9r\" (UID: \"54b2f71d-77a1-4e08-a815-9ab83ceffd54\") " pod="service-telemetry/stf-smoketest-smoke1-qfb9r" Oct 03 00:38:57 crc kubenswrapper[4832]: I1003 00:38:57.268794 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collectd-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/54b2f71d-77a1-4e08-a815-9ab83ceffd54-collectd-entrypoint-script\") pod \"stf-smoketest-smoke1-qfb9r\" (UID: \"54b2f71d-77a1-4e08-a815-9ab83ceffd54\") " pod="service-telemetry/stf-smoketest-smoke1-qfb9r" Oct 03 00:38:57 crc kubenswrapper[4832]: I1003 00:38:57.268813 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collectd-config\" (UniqueName: \"kubernetes.io/configmap/54b2f71d-77a1-4e08-a815-9ab83ceffd54-collectd-config\") pod \"stf-smoketest-smoke1-qfb9r\" (UID: \"54b2f71d-77a1-4e08-a815-9ab83ceffd54\") " pod="service-telemetry/stf-smoketest-smoke1-qfb9r" Oct 03 00:38:57 crc kubenswrapper[4832]: I1003 00:38:57.268828 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sensubility-config\" (UniqueName: \"kubernetes.io/configmap/54b2f71d-77a1-4e08-a815-9ab83ceffd54-sensubility-config\") pod \"stf-smoketest-smoke1-qfb9r\" (UID: 
\"54b2f71d-77a1-4e08-a815-9ab83ceffd54\") " pod="service-telemetry/stf-smoketest-smoke1-qfb9r" Oct 03 00:38:57 crc kubenswrapper[4832]: I1003 00:38:57.268848 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-publisher\" (UniqueName: \"kubernetes.io/configmap/54b2f71d-77a1-4e08-a815-9ab83ceffd54-ceilometer-publisher\") pod \"stf-smoketest-smoke1-qfb9r\" (UID: \"54b2f71d-77a1-4e08-a815-9ab83ceffd54\") " pod="service-telemetry/stf-smoketest-smoke1-qfb9r" Oct 03 00:38:57 crc kubenswrapper[4832]: I1003 00:38:57.269740 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/54b2f71d-77a1-4e08-a815-9ab83ceffd54-ceilometer-entrypoint-script\") pod \"stf-smoketest-smoke1-qfb9r\" (UID: \"54b2f71d-77a1-4e08-a815-9ab83ceffd54\") " pod="service-telemetry/stf-smoketest-smoke1-qfb9r" Oct 03 00:38:57 crc kubenswrapper[4832]: I1003 00:38:57.269993 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-publisher\" (UniqueName: \"kubernetes.io/configmap/54b2f71d-77a1-4e08-a815-9ab83ceffd54-ceilometer-publisher\") pod \"stf-smoketest-smoke1-qfb9r\" (UID: \"54b2f71d-77a1-4e08-a815-9ab83ceffd54\") " pod="service-telemetry/stf-smoketest-smoke1-qfb9r" Oct 03 00:38:57 crc kubenswrapper[4832]: I1003 00:38:57.270059 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"healthcheck-log\" (UniqueName: \"kubernetes.io/configmap/54b2f71d-77a1-4e08-a815-9ab83ceffd54-healthcheck-log\") pod \"stf-smoketest-smoke1-qfb9r\" (UID: \"54b2f71d-77a1-4e08-a815-9ab83ceffd54\") " pod="service-telemetry/stf-smoketest-smoke1-qfb9r" Oct 03 00:38:57 crc kubenswrapper[4832]: I1003 00:38:57.270715 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collectd-config\" (UniqueName: \"kubernetes.io/configmap/54b2f71d-77a1-4e08-a815-9ab83ceffd54-collectd-config\") pod \"stf-smoketest-smoke1-qfb9r\" (UID: \"54b2f71d-77a1-4e08-a815-9ab83ceffd54\") " pod="service-telemetry/stf-smoketest-smoke1-qfb9r" Oct 03 00:38:57 crc kubenswrapper[4832]: I1003 00:38:57.270814 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collectd-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/54b2f71d-77a1-4e08-a815-9ab83ceffd54-collectd-entrypoint-script\") pod \"stf-smoketest-smoke1-qfb9r\" (UID: \"54b2f71d-77a1-4e08-a815-9ab83ceffd54\") " pod="service-telemetry/stf-smoketest-smoke1-qfb9r" Oct 03 00:38:57 crc kubenswrapper[4832]: I1003 00:38:57.271070 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sensubility-config\" (UniqueName: \"kubernetes.io/configmap/54b2f71d-77a1-4e08-a815-9ab83ceffd54-sensubility-config\") pod \"stf-smoketest-smoke1-qfb9r\" (UID: \"54b2f71d-77a1-4e08-a815-9ab83ceffd54\") " pod="service-telemetry/stf-smoketest-smoke1-qfb9r" Oct 03 00:38:57 crc kubenswrapper[4832]: I1003 00:38:57.287670 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-87nrd\" (UniqueName: \"kubernetes.io/projected/54b2f71d-77a1-4e08-a815-9ab83ceffd54-kube-api-access-87nrd\") pod \"stf-smoketest-smoke1-qfb9r\" (UID: \"54b2f71d-77a1-4e08-a815-9ab83ceffd54\") " pod="service-telemetry/stf-smoketest-smoke1-qfb9r" Oct 03 00:38:57 crc kubenswrapper[4832]: I1003 00:38:57.462086 4832 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/stf-smoketest-smoke1-qfb9r" Oct 03 00:38:57 crc kubenswrapper[4832]: I1003 00:38:57.513065 4832 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/curl"] Oct 03 00:38:57 crc kubenswrapper[4832]: I1003 00:38:57.513936 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/curl" Oct 03 00:38:57 crc kubenswrapper[4832]: I1003 00:38:57.517554 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/curl"] Oct 03 00:38:57 crc kubenswrapper[4832]: I1003 00:38:57.574024 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5t7rp\" (UniqueName: \"kubernetes.io/projected/74f724d9-3345-4991-bc41-2c52e89748f4-kube-api-access-5t7rp\") pod \"curl\" (UID: \"74f724d9-3345-4991-bc41-2c52e89748f4\") " pod="service-telemetry/curl" Oct 03 00:38:57 crc kubenswrapper[4832]: I1003 00:38:57.675682 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5t7rp\" (UniqueName: \"kubernetes.io/projected/74f724d9-3345-4991-bc41-2c52e89748f4-kube-api-access-5t7rp\") pod \"curl\" (UID: \"74f724d9-3345-4991-bc41-2c52e89748f4\") " pod="service-telemetry/curl" Oct 03 00:38:57 crc kubenswrapper[4832]: I1003 00:38:57.693643 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5t7rp\" (UniqueName: \"kubernetes.io/projected/74f724d9-3345-4991-bc41-2c52e89748f4-kube-api-access-5t7rp\") pod \"curl\" (UID: \"74f724d9-3345-4991-bc41-2c52e89748f4\") " pod="service-telemetry/curl" Oct 03 00:38:57 crc kubenswrapper[4832]: I1003 00:38:57.862779 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/curl" Oct 03 00:38:57 crc kubenswrapper[4832]: I1003 00:38:57.884622 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/stf-smoketest-smoke1-qfb9r"] Oct 03 00:38:58 crc kubenswrapper[4832]: I1003 00:38:58.034895 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/curl"] Oct 03 00:38:58 crc kubenswrapper[4832]: W1003 00:38:58.041071 4832 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod74f724d9_3345_4991_bc41_2c52e89748f4.slice/crio-4b97dbb8a09922c6c1bd32e960a576eb119185d9229e6b6a467aee548c690110 WatchSource:0}: Error finding container 4b97dbb8a09922c6c1bd32e960a576eb119185d9229e6b6a467aee548c690110: Status 404 returned error can't find the container with id 4b97dbb8a09922c6c1bd32e960a576eb119185d9229e6b6a467aee548c690110 Oct 03 00:38:58 crc kubenswrapper[4832]: I1003 00:38:58.884481 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/stf-smoketest-smoke1-qfb9r" event={"ID":"54b2f71d-77a1-4e08-a815-9ab83ceffd54","Type":"ContainerStarted","Data":"3bd6921eeb07e9a7689b5b25b4b113f2b0dc931dfda1c9128aec79e09f804d05"} Oct 03 00:38:58 crc kubenswrapper[4832]: I1003 00:38:58.885923 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/curl" event={"ID":"74f724d9-3345-4991-bc41-2c52e89748f4","Type":"ContainerStarted","Data":"4b97dbb8a09922c6c1bd32e960a576eb119185d9229e6b6a467aee548c690110"} Oct 03 00:38:59 crc kubenswrapper[4832]: I1003 00:38:59.894633 4832 generic.go:334] "Generic (PLEG): container finished" podID="74f724d9-3345-4991-bc41-2c52e89748f4" containerID="117710486e823a4c245cd4c667fde7529b2168d74ed1929294120808a18dce7c" exitCode=0 Oct 03 
00:38:59 crc kubenswrapper[4832]: I1003 00:38:59.894684 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/curl" event={"ID":"74f724d9-3345-4991-bc41-2c52e89748f4","Type":"ContainerDied","Data":"117710486e823a4c245cd4c667fde7529b2168d74ed1929294120808a18dce7c"} Oct 03 00:39:02 crc kubenswrapper[4832]: I1003 00:39:02.438011 4832 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/curl" Oct 03 00:39:02 crc kubenswrapper[4832]: I1003 00:39:02.546900 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5t7rp\" (UniqueName: \"kubernetes.io/projected/74f724d9-3345-4991-bc41-2c52e89748f4-kube-api-access-5t7rp\") pod \"74f724d9-3345-4991-bc41-2c52e89748f4\" (UID: \"74f724d9-3345-4991-bc41-2c52e89748f4\") " Oct 03 00:39:02 crc kubenswrapper[4832]: I1003 00:39:02.565542 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74f724d9-3345-4991-bc41-2c52e89748f4-kube-api-access-5t7rp" (OuterVolumeSpecName: "kube-api-access-5t7rp") pod "74f724d9-3345-4991-bc41-2c52e89748f4" (UID: "74f724d9-3345-4991-bc41-2c52e89748f4"). InnerVolumeSpecName "kube-api-access-5t7rp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 00:39:02 crc kubenswrapper[4832]: I1003 00:39:02.643990 4832 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_curl_74f724d9-3345-4991-bc41-2c52e89748f4/curl/0.log" Oct 03 00:39:02 crc kubenswrapper[4832]: I1003 00:39:02.649074 4832 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5t7rp\" (UniqueName: \"kubernetes.io/projected/74f724d9-3345-4991-bc41-2c52e89748f4-kube-api-access-5t7rp\") on node \"crc\" DevicePath \"\"" Oct 03 00:39:02 crc kubenswrapper[4832]: I1003 00:39:02.915619 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/curl" event={"ID":"74f724d9-3345-4991-bc41-2c52e89748f4","Type":"ContainerDied","Data":"4b97dbb8a09922c6c1bd32e960a576eb119185d9229e6b6a467aee548c690110"} Oct 03 00:39:02 crc kubenswrapper[4832]: I1003 00:39:02.915660 4832 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4b97dbb8a09922c6c1bd32e960a576eb119185d9229e6b6a467aee548c690110" Oct 03 00:39:02 crc kubenswrapper[4832]: I1003 00:39:02.915666 4832 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/curl" Oct 03 00:39:02 crc kubenswrapper[4832]: I1003 00:39:02.942650 4832 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-snmp-webhook-6856cfb745-8xhrr_1fcde8b7-66a3-468b-a67d-784b8b5f8e21/prometheus-webhook-snmp/0.log" Oct 03 00:39:08 crc kubenswrapper[4832]: I1003 00:39:08.951725 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/stf-smoketest-smoke1-qfb9r" event={"ID":"54b2f71d-77a1-4e08-a815-9ab83ceffd54","Type":"ContainerStarted","Data":"aeb010ffa4c4045a1fb9a86263747e5e6c3706df05311f74e3f3c316ef29443d"} Oct 03 00:39:16 crc kubenswrapper[4832]: I1003 00:39:16.003114 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/stf-smoketest-smoke1-qfb9r" event={"ID":"54b2f71d-77a1-4e08-a815-9ab83ceffd54","Type":"ContainerStarted","Data":"33fa9aee3ec5b34df69168e35e4b16918165d752af194614484bd71c326879ed"} Oct 03 00:39:16 crc kubenswrapper[4832]: I1003 00:39:16.020257 4832 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/stf-smoketest-smoke1-qfb9r" podStartSLOduration=2.027147387 podStartE2EDuration="19.020242904s" podCreationTimestamp="2025-10-03 00:38:57 +0000 UTC" firstStartedPulling="2025-10-03 00:38:57.900997996 +0000 UTC m=+1931.163153102" lastFinishedPulling="2025-10-03 00:39:14.894093513 +0000 UTC m=+1948.156248619" observedRunningTime="2025-10-03 00:39:16.01886139 +0000 UTC m=+1949.281016506" watchObservedRunningTime="2025-10-03 00:39:16.020242904 +0000 UTC m=+1949.282398010" Oct 03 00:39:26 crc kubenswrapper[4832]: I1003 00:39:26.396865 4832 patch_prober.go:28] interesting pod/machine-config-daemon-xzpk4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 00:39:26 crc kubenswrapper[4832]: I1003 00:39:26.397384 4832 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" podUID="13eeb165-19f7-4e3f-a442-1c9c570d853b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 00:39:33 crc kubenswrapper[4832]: I1003 00:39:33.078590 4832 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-snmp-webhook-6856cfb745-8xhrr_1fcde8b7-66a3-468b-a67d-784b8b5f8e21/prometheus-webhook-snmp/0.log" Oct 03 00:39:43 crc kubenswrapper[4832]: I1003 00:39:43.182641 4832 generic.go:334] "Generic (PLEG): container finished" podID="54b2f71d-77a1-4e08-a815-9ab83ceffd54" containerID="aeb010ffa4c4045a1fb9a86263747e5e6c3706df05311f74e3f3c316ef29443d" exitCode=0 Oct 03 00:39:43 crc kubenswrapper[4832]: I1003 00:39:43.182710 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/stf-smoketest-smoke1-qfb9r" event={"ID":"54b2f71d-77a1-4e08-a815-9ab83ceffd54","Type":"ContainerDied","Data":"aeb010ffa4c4045a1fb9a86263747e5e6c3706df05311f74e3f3c316ef29443d"} Oct 03 00:39:43 crc kubenswrapper[4832]: I1003 00:39:43.183720 4832 scope.go:117] "RemoveContainer" containerID="aeb010ffa4c4045a1fb9a86263747e5e6c3706df05311f74e3f3c316ef29443d" Oct 03 00:39:47 crc kubenswrapper[4832]: I1003 00:39:47.213271 4832 generic.go:334] "Generic (PLEG): container finished" podID="54b2f71d-77a1-4e08-a815-9ab83ceffd54" 
containerID="33fa9aee3ec5b34df69168e35e4b16918165d752af194614484bd71c326879ed" exitCode=0 Oct 03 00:39:47 crc kubenswrapper[4832]: I1003 00:39:47.213675 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/stf-smoketest-smoke1-qfb9r" event={"ID":"54b2f71d-77a1-4e08-a815-9ab83ceffd54","Type":"ContainerDied","Data":"33fa9aee3ec5b34df69168e35e4b16918165d752af194614484bd71c326879ed"} Oct 03 00:39:48 crc kubenswrapper[4832]: I1003 00:39:48.474035 4832 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/stf-smoketest-smoke1-qfb9r" Oct 03 00:39:48 crc kubenswrapper[4832]: I1003 00:39:48.621303 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-publisher\" (UniqueName: \"kubernetes.io/configmap/54b2f71d-77a1-4e08-a815-9ab83ceffd54-ceilometer-publisher\") pod \"54b2f71d-77a1-4e08-a815-9ab83ceffd54\" (UID: \"54b2f71d-77a1-4e08-a815-9ab83ceffd54\") " Oct 03 00:39:48 crc kubenswrapper[4832]: I1003 00:39:48.621395 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/54b2f71d-77a1-4e08-a815-9ab83ceffd54-ceilometer-entrypoint-script\") pod \"54b2f71d-77a1-4e08-a815-9ab83ceffd54\" (UID: \"54b2f71d-77a1-4e08-a815-9ab83ceffd54\") " Oct 03 00:39:48 crc kubenswrapper[4832]: I1003 00:39:48.621498 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"collectd-config\" (UniqueName: \"kubernetes.io/configmap/54b2f71d-77a1-4e08-a815-9ab83ceffd54-collectd-config\") pod \"54b2f71d-77a1-4e08-a815-9ab83ceffd54\" (UID: \"54b2f71d-77a1-4e08-a815-9ab83ceffd54\") " Oct 03 00:39:48 crc kubenswrapper[4832]: I1003 00:39:48.621567 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"collectd-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/54b2f71d-77a1-4e08-a815-9ab83ceffd54-collectd-entrypoint-script\") pod \"54b2f71d-77a1-4e08-a815-9ab83ceffd54\" (UID: \"54b2f71d-77a1-4e08-a815-9ab83ceffd54\") " Oct 03 00:39:48 crc kubenswrapper[4832]: I1003 00:39:48.622108 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"healthcheck-log\" (UniqueName: \"kubernetes.io/configmap/54b2f71d-77a1-4e08-a815-9ab83ceffd54-healthcheck-log\") pod \"54b2f71d-77a1-4e08-a815-9ab83ceffd54\" (UID: \"54b2f71d-77a1-4e08-a815-9ab83ceffd54\") " Oct 03 00:39:48 crc kubenswrapper[4832]: I1003 00:39:48.622235 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sensubility-config\" (UniqueName: \"kubernetes.io/configmap/54b2f71d-77a1-4e08-a815-9ab83ceffd54-sensubility-config\") pod \"54b2f71d-77a1-4e08-a815-9ab83ceffd54\" (UID: \"54b2f71d-77a1-4e08-a815-9ab83ceffd54\") " Oct 03 00:39:48 crc kubenswrapper[4832]: I1003 00:39:48.622319 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-87nrd\" (UniqueName: \"kubernetes.io/projected/54b2f71d-77a1-4e08-a815-9ab83ceffd54-kube-api-access-87nrd\") pod \"54b2f71d-77a1-4e08-a815-9ab83ceffd54\" (UID: \"54b2f71d-77a1-4e08-a815-9ab83ceffd54\") " Oct 03 00:39:48 crc kubenswrapper[4832]: I1003 00:39:48.626533 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54b2f71d-77a1-4e08-a815-9ab83ceffd54-kube-api-access-87nrd" (OuterVolumeSpecName: "kube-api-access-87nrd") pod "54b2f71d-77a1-4e08-a815-9ab83ceffd54" (UID: "54b2f71d-77a1-4e08-a815-9ab83ceffd54"). 
InnerVolumeSpecName "kube-api-access-87nrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 00:39:48 crc kubenswrapper[4832]: I1003 00:39:48.642169 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/54b2f71d-77a1-4e08-a815-9ab83ceffd54-sensubility-config" (OuterVolumeSpecName: "sensubility-config") pod "54b2f71d-77a1-4e08-a815-9ab83ceffd54" (UID: "54b2f71d-77a1-4e08-a815-9ab83ceffd54"). InnerVolumeSpecName "sensubility-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 00:39:48 crc kubenswrapper[4832]: I1003 00:39:48.653608 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/54b2f71d-77a1-4e08-a815-9ab83ceffd54-ceilometer-entrypoint-script" (OuterVolumeSpecName: "ceilometer-entrypoint-script") pod "54b2f71d-77a1-4e08-a815-9ab83ceffd54" (UID: "54b2f71d-77a1-4e08-a815-9ab83ceffd54"). InnerVolumeSpecName "ceilometer-entrypoint-script". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 00:39:48 crc kubenswrapper[4832]: I1003 00:39:48.653642 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/54b2f71d-77a1-4e08-a815-9ab83ceffd54-ceilometer-publisher" (OuterVolumeSpecName: "ceilometer-publisher") pod "54b2f71d-77a1-4e08-a815-9ab83ceffd54" (UID: "54b2f71d-77a1-4e08-a815-9ab83ceffd54"). InnerVolumeSpecName "ceilometer-publisher". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 00:39:48 crc kubenswrapper[4832]: I1003 00:39:48.655157 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/54b2f71d-77a1-4e08-a815-9ab83ceffd54-collectd-entrypoint-script" (OuterVolumeSpecName: "collectd-entrypoint-script") pod "54b2f71d-77a1-4e08-a815-9ab83ceffd54" (UID: "54b2f71d-77a1-4e08-a815-9ab83ceffd54"). InnerVolumeSpecName "collectd-entrypoint-script". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 00:39:48 crc kubenswrapper[4832]: I1003 00:39:48.656536 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/54b2f71d-77a1-4e08-a815-9ab83ceffd54-healthcheck-log" (OuterVolumeSpecName: "healthcheck-log") pod "54b2f71d-77a1-4e08-a815-9ab83ceffd54" (UID: "54b2f71d-77a1-4e08-a815-9ab83ceffd54"). InnerVolumeSpecName "healthcheck-log". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 00:39:48 crc kubenswrapper[4832]: I1003 00:39:48.657255 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/54b2f71d-77a1-4e08-a815-9ab83ceffd54-collectd-config" (OuterVolumeSpecName: "collectd-config") pod "54b2f71d-77a1-4e08-a815-9ab83ceffd54" (UID: "54b2f71d-77a1-4e08-a815-9ab83ceffd54"). InnerVolumeSpecName "collectd-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 00:39:48 crc kubenswrapper[4832]: I1003 00:39:48.724390 4832 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-87nrd\" (UniqueName: \"kubernetes.io/projected/54b2f71d-77a1-4e08-a815-9ab83ceffd54-kube-api-access-87nrd\") on node \"crc\" DevicePath \"\"" Oct 03 00:39:48 crc kubenswrapper[4832]: I1003 00:39:48.724432 4832 reconciler_common.go:293] "Volume detached for volume \"ceilometer-publisher\" (UniqueName: \"kubernetes.io/configmap/54b2f71d-77a1-4e08-a815-9ab83ceffd54-ceilometer-publisher\") on node \"crc\" DevicePath \"\"" Oct 03 00:39:48 crc kubenswrapper[4832]: I1003 00:39:48.724445 4832 reconciler_common.go:293] "Volume detached for volume \"ceilometer-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/54b2f71d-77a1-4e08-a815-9ab83ceffd54-ceilometer-entrypoint-script\") on node \"crc\" DevicePath \"\"" Oct 03 00:39:48 crc kubenswrapper[4832]: I1003 00:39:48.724456 4832 reconciler_common.go:293] "Volume detached for volume \"collectd-config\" (UniqueName: \"kubernetes.io/configmap/54b2f71d-77a1-4e08-a815-9ab83ceffd54-collectd-config\") on node \"crc\" DevicePath \"\"" Oct 03 00:39:48 crc kubenswrapper[4832]: I1003 00:39:48.724466 4832 reconciler_common.go:293] "Volume detached for volume \"collectd-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/54b2f71d-77a1-4e08-a815-9ab83ceffd54-collectd-entrypoint-script\") on node \"crc\" DevicePath \"\"" Oct 03 00:39:48 crc kubenswrapper[4832]: I1003 00:39:48.724476 4832 reconciler_common.go:293] "Volume detached for volume \"healthcheck-log\" (UniqueName: \"kubernetes.io/configmap/54b2f71d-77a1-4e08-a815-9ab83ceffd54-healthcheck-log\") on node \"crc\" DevicePath \"\"" Oct 03 00:39:48 crc kubenswrapper[4832]: I1003 00:39:48.724485 4832 reconciler_common.go:293] "Volume detached for volume \"sensubility-config\" (UniqueName: \"kubernetes.io/configmap/54b2f71d-77a1-4e08-a815-9ab83ceffd54-sensubility-config\") on node \"crc\" DevicePath \"\"" Oct 03 00:39:49 crc kubenswrapper[4832]: I1003 00:39:49.232661 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/stf-smoketest-smoke1-qfb9r" event={"ID":"54b2f71d-77a1-4e08-a815-9ab83ceffd54","Type":"ContainerDied","Data":"3bd6921eeb07e9a7689b5b25b4b113f2b0dc931dfda1c9128aec79e09f804d05"} Oct 03 00:39:49 crc kubenswrapper[4832]: I1003 00:39:49.233249 4832 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3bd6921eeb07e9a7689b5b25b4b113f2b0dc931dfda1c9128aec79e09f804d05" Oct 03 00:39:49 crc kubenswrapper[4832]: I1003 00:39:49.232777 4832 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/stf-smoketest-smoke1-qfb9r" Oct 03 00:39:50 crc kubenswrapper[4832]: I1003 00:39:50.508741 4832 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_stf-smoketest-smoke1-qfb9r_54b2f71d-77a1-4e08-a815-9ab83ceffd54/smoketest-collectd/0.log" Oct 03 00:39:50 crc kubenswrapper[4832]: I1003 00:39:50.841698 4832 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_stf-smoketest-smoke1-qfb9r_54b2f71d-77a1-4e08-a815-9ab83ceffd54/smoketest-ceilometer/0.log" Oct 03 00:39:51 crc kubenswrapper[4832]: I1003 00:39:51.193861 4832 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-interconnect-68864d46cb-tqtcw_1804aec4-102a-4c92-903d-f5b680d9548d/default-interconnect/0.log" Oct 03 00:39:51 crc kubenswrapper[4832]: I1003 00:39:51.481311 4832 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-coll-meter-smartgateway-7cd87f9766-jvpfj_88373a98-b547-43d0-982f-69c1fd9f5507/bridge/2.log" Oct 03 00:39:51 crc kubenswrapper[4832]: I1003 00:39:51.795646 4832 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-coll-meter-smartgateway-7cd87f9766-jvpfj_88373a98-b547-43d0-982f-69c1fd9f5507/sg-core/0.log" Oct 03 00:39:52 crc kubenswrapper[4832]: I1003 00:39:52.089826 4832 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-coll-event-smartgateway-5c89775d54-8f5sj_d30f9d9a-e3d9-4ea2-a2bb-7cae3b2f28fd/bridge/2.log" Oct 03 00:39:52 crc kubenswrapper[4832]: I1003 00:39:52.351604 4832 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-coll-event-smartgateway-5c89775d54-8f5sj_d30f9d9a-e3d9-4ea2-a2bb-7cae3b2f28fd/sg-core/0.log" Oct 03 00:39:52 crc kubenswrapper[4832]: I1003 00:39:52.676313 4832 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-ceil-meter-smartgateway-57948895dc-n2mcv_4d0adf6e-7076-4af6-8e81-54e16611154f/bridge/2.log" Oct 03 00:39:52 crc kubenswrapper[4832]: I1003 00:39:52.955374 4832 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-ceil-meter-smartgateway-57948895dc-n2mcv_4d0adf6e-7076-4af6-8e81-54e16611154f/sg-core/0.log" Oct 03 00:39:53 crc kubenswrapper[4832]: I1003 00:39:53.242316 4832 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-ceil-event-smartgateway-596fcd4567-w6w4h_bfbd463d-c0ee-40b4-b7af-d363939c485f/bridge/2.log" Oct 03 00:39:53 crc kubenswrapper[4832]: I1003 00:39:53.516976 4832 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-ceil-event-smartgateway-596fcd4567-w6w4h_bfbd463d-c0ee-40b4-b7af-d363939c485f/sg-core/0.log" Oct 03 00:39:53 crc kubenswrapper[4832]: I1003 00:39:53.812961 4832 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-sens-meter-smartgateway-5759b4d97-9x79p_c76c3820-5067-4afd-8954-5124625f3462/bridge/2.log" Oct 03 00:39:54 crc kubenswrapper[4832]: I1003 00:39:54.113463 4832 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-sens-meter-smartgateway-5759b4d97-9x79p_c76c3820-5067-4afd-8954-5124625f3462/sg-core/0.log" Oct 03 00:39:56 crc kubenswrapper[4832]: I1003 00:39:56.358745 4832 log.go:25] "Finished parsing log file" 
path="/var/log/pods/service-telemetry_smart-gateway-operator-547fdfd8bd-xdfmh_7eb70cec-c2c0-46a3-9156-8f652211e548/operator/0.log" Oct 03 00:39:56 crc kubenswrapper[4832]: I1003 00:39:56.397777 4832 patch_prober.go:28] interesting pod/machine-config-daemon-xzpk4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 00:39:56 crc kubenswrapper[4832]: I1003 00:39:56.397830 4832 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" podUID="13eeb165-19f7-4e3f-a442-1c9c570d853b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 00:39:56 crc kubenswrapper[4832]: I1003 00:39:56.397901 4832 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" Oct 03 00:39:56 crc kubenswrapper[4832]: I1003 00:39:56.398578 4832 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4afdf1456e895a3ee6a789bc43670d5f515cd55a0ecec2fcf0dfe7cbf97d3896"} pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 00:39:56 crc kubenswrapper[4832]: I1003 00:39:56.398628 4832 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" podUID="13eeb165-19f7-4e3f-a442-1c9c570d853b" containerName="machine-config-daemon" containerID="cri-o://4afdf1456e895a3ee6a789bc43670d5f515cd55a0ecec2fcf0dfe7cbf97d3896" gracePeriod=600 Oct 03 00:39:56 crc kubenswrapper[4832]: I1003 00:39:56.707412 4832 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_prometheus-default-0_0c8b2b5b-8aba-499a-8e9f-5e166ede2ac2/prometheus/0.log" Oct 03 00:39:57 crc kubenswrapper[4832]: I1003 00:39:57.044667 4832 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_elasticsearch-es-default-0_43340dac-0a14-49ad-b56c-22a735ee0273/elasticsearch/0.log" Oct 03 00:39:57 crc kubenswrapper[4832]: I1003 00:39:57.282587 4832 generic.go:334] "Generic (PLEG): container finished" podID="13eeb165-19f7-4e3f-a442-1c9c570d853b" containerID="4afdf1456e895a3ee6a789bc43670d5f515cd55a0ecec2fcf0dfe7cbf97d3896" exitCode=0 Oct 03 00:39:57 crc kubenswrapper[4832]: I1003 00:39:57.282626 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" event={"ID":"13eeb165-19f7-4e3f-a442-1c9c570d853b","Type":"ContainerDied","Data":"4afdf1456e895a3ee6a789bc43670d5f515cd55a0ecec2fcf0dfe7cbf97d3896"} Oct 03 00:39:57 crc kubenswrapper[4832]: I1003 00:39:57.282650 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" event={"ID":"13eeb165-19f7-4e3f-a442-1c9c570d853b","Type":"ContainerStarted","Data":"2bc2358c16385c66ec41ae19ebbdd194d417b76256418784be10fdb6f0a63e4e"} Oct 03 00:39:57 crc kubenswrapper[4832]: I1003 00:39:57.282668 4832 scope.go:117] "RemoveContainer" containerID="e44b7de9ac0dc4f733d645c5d56e63447d6c890efc8c243af73d5c7937b609ee" Oct 03 00:39:57 crc kubenswrapper[4832]: I1003 00:39:57.320652 4832 log.go:25] 
"Finished parsing log file" path="/var/log/pods/service-telemetry_default-snmp-webhook-6856cfb745-8xhrr_1fcde8b7-66a3-468b-a67d-784b8b5f8e21/prometheus-webhook-snmp/0.log" Oct 03 00:39:57 crc kubenswrapper[4832]: I1003 00:39:57.604587 4832 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_alertmanager-default-0_7a756837-fae9-4644-9fed-56fe7576ea4c/alertmanager/0.log" Oct 03 00:40:13 crc kubenswrapper[4832]: I1003 00:40:13.499839 4832 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-7d6c88b75b-sjdm5_bafba4cc-4e6f-47fd-be34-4547031375e2/operator/0.log" Oct 03 00:40:15 crc kubenswrapper[4832]: I1003 00:40:15.755472 4832 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-547fdfd8bd-xdfmh_7eb70cec-c2c0-46a3-9156-8f652211e548/operator/0.log" Oct 03 00:40:16 crc kubenswrapper[4832]: I1003 00:40:16.058030 4832 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_qdr-test_24586eac-145b-4e1b-b7ce-1e6116a3d3d8/qdr/0.log" Oct 03 00:40:54 crc kubenswrapper[4832]: I1003 00:40:54.024380 4832 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-rwc5h/must-gather-s5tt4"] Oct 03 00:40:54 crc kubenswrapper[4832]: E1003 00:40:54.026261 4832 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54b2f71d-77a1-4e08-a815-9ab83ceffd54" containerName="smoketest-collectd" Oct 03 00:40:54 crc kubenswrapper[4832]: I1003 00:40:54.026356 4832 state_mem.go:107] "Deleted CPUSet assignment" podUID="54b2f71d-77a1-4e08-a815-9ab83ceffd54" containerName="smoketest-collectd" Oct 03 00:40:54 crc kubenswrapper[4832]: E1003 00:40:54.026436 4832 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74f724d9-3345-4991-bc41-2c52e89748f4" containerName="curl" Oct 03 00:40:54 crc kubenswrapper[4832]: I1003 00:40:54.026494 4832 state_mem.go:107] "Deleted CPUSet assignment" podUID="74f724d9-3345-4991-bc41-2c52e89748f4" containerName="curl" Oct 03 00:40:54 crc kubenswrapper[4832]: E1003 00:40:54.026567 4832 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54b2f71d-77a1-4e08-a815-9ab83ceffd54" containerName="smoketest-ceilometer" Oct 03 00:40:54 crc kubenswrapper[4832]: I1003 00:40:54.026623 4832 state_mem.go:107] "Deleted CPUSet assignment" podUID="54b2f71d-77a1-4e08-a815-9ab83ceffd54" containerName="smoketest-ceilometer" Oct 03 00:40:54 crc kubenswrapper[4832]: I1003 00:40:54.026803 4832 memory_manager.go:354] "RemoveStaleState removing state" podUID="54b2f71d-77a1-4e08-a815-9ab83ceffd54" containerName="smoketest-ceilometer" Oct 03 00:40:54 crc kubenswrapper[4832]: I1003 00:40:54.026878 4832 memory_manager.go:354] "RemoveStaleState removing state" podUID="74f724d9-3345-4991-bc41-2c52e89748f4" containerName="curl" Oct 03 00:40:54 crc kubenswrapper[4832]: I1003 00:40:54.026960 4832 memory_manager.go:354] "RemoveStaleState removing state" podUID="54b2f71d-77a1-4e08-a815-9ab83ceffd54" containerName="smoketest-collectd" Oct 03 00:40:54 crc kubenswrapper[4832]: I1003 00:40:54.027861 4832 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-rwc5h/must-gather-s5tt4" Oct 03 00:40:54 crc kubenswrapper[4832]: I1003 00:40:54.030846 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-rwc5h"/"openshift-service-ca.crt" Oct 03 00:40:54 crc kubenswrapper[4832]: I1003 00:40:54.035234 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-rwc5h"/"kube-root-ca.crt" Oct 03 00:40:54 crc kubenswrapper[4832]: I1003 00:40:54.051586 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-rwc5h/must-gather-s5tt4"] Oct 03 00:40:54 crc kubenswrapper[4832]: I1003 00:40:54.075005 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/aacfe36a-5532-467d-a4cf-5a022b9337be-must-gather-output\") pod \"must-gather-s5tt4\" (UID: \"aacfe36a-5532-467d-a4cf-5a022b9337be\") " pod="openshift-must-gather-rwc5h/must-gather-s5tt4" Oct 03 00:40:54 crc kubenswrapper[4832]: I1003 00:40:54.075070 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2pqvj\" (UniqueName: \"kubernetes.io/projected/aacfe36a-5532-467d-a4cf-5a022b9337be-kube-api-access-2pqvj\") pod \"must-gather-s5tt4\" (UID: \"aacfe36a-5532-467d-a4cf-5a022b9337be\") " pod="openshift-must-gather-rwc5h/must-gather-s5tt4" Oct 03 00:40:54 crc kubenswrapper[4832]: I1003 00:40:54.176405 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2pqvj\" (UniqueName: \"kubernetes.io/projected/aacfe36a-5532-467d-a4cf-5a022b9337be-kube-api-access-2pqvj\") pod \"must-gather-s5tt4\" (UID: \"aacfe36a-5532-467d-a4cf-5a022b9337be\") " pod="openshift-must-gather-rwc5h/must-gather-s5tt4" Oct 03 00:40:54 crc kubenswrapper[4832]: I1003 00:40:54.176501 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/aacfe36a-5532-467d-a4cf-5a022b9337be-must-gather-output\") pod \"must-gather-s5tt4\" (UID: \"aacfe36a-5532-467d-a4cf-5a022b9337be\") " pod="openshift-must-gather-rwc5h/must-gather-s5tt4" Oct 03 00:40:54 crc kubenswrapper[4832]: I1003 00:40:54.176932 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/aacfe36a-5532-467d-a4cf-5a022b9337be-must-gather-output\") pod \"must-gather-s5tt4\" (UID: \"aacfe36a-5532-467d-a4cf-5a022b9337be\") " pod="openshift-must-gather-rwc5h/must-gather-s5tt4" Oct 03 00:40:54 crc kubenswrapper[4832]: I1003 00:40:54.216673 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2pqvj\" (UniqueName: \"kubernetes.io/projected/aacfe36a-5532-467d-a4cf-5a022b9337be-kube-api-access-2pqvj\") pod \"must-gather-s5tt4\" (UID: \"aacfe36a-5532-467d-a4cf-5a022b9337be\") " pod="openshift-must-gather-rwc5h/must-gather-s5tt4" Oct 03 00:40:54 crc kubenswrapper[4832]: I1003 00:40:54.347014 4832 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-rwc5h/must-gather-s5tt4" Oct 03 00:40:54 crc kubenswrapper[4832]: I1003 00:40:54.756798 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-rwc5h/must-gather-s5tt4"] Oct 03 00:40:55 crc kubenswrapper[4832]: I1003 00:40:55.732598 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rwc5h/must-gather-s5tt4" event={"ID":"aacfe36a-5532-467d-a4cf-5a022b9337be","Type":"ContainerStarted","Data":"000a634ef6980cb3d76eae29a8f5087f4eb7e7e3715825c25b3097f42a787d9f"} Oct 03 00:40:57 crc kubenswrapper[4832]: I1003 00:40:57.920977 4832 scope.go:117] "RemoveContainer" containerID="d6d2ee1959c061b6ba3e2568f6befcba100a2a2bc342949d6491887aeed58d36" Oct 03 00:40:58 crc kubenswrapper[4832]: I1003 00:40:58.981773 4832 scope.go:117] "RemoveContainer" containerID="d323078a056e84f35a2ed2626bee82bd9e2cc82f663ca7a83e0189cb0d4d4c6f" Oct 03 00:40:59 crc kubenswrapper[4832]: I1003 00:40:59.029073 4832 scope.go:117] "RemoveContainer" containerID="1d81fc5a6cca6d635409fe31566be937cfca1450fde32e3e86de414cdf57eee8" Oct 03 00:40:59 crc kubenswrapper[4832]: I1003 00:40:59.073209 4832 scope.go:117] "RemoveContainer" containerID="4555006c9fe0ac277c5d092c57890032831e69808062de2dbc38fbf6aca54357" Oct 03 00:40:59 crc kubenswrapper[4832]: I1003 00:40:59.763139 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rwc5h/must-gather-s5tt4" event={"ID":"aacfe36a-5532-467d-a4cf-5a022b9337be","Type":"ContainerStarted","Data":"8fe4eb825e8d6b6ee0a691314ccaf6b663c5a5d52c95b0ca3c414d5f72db9f34"} Oct 03 00:40:59 crc kubenswrapper[4832]: I1003 00:40:59.763452 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rwc5h/must-gather-s5tt4" event={"ID":"aacfe36a-5532-467d-a4cf-5a022b9337be","Type":"ContainerStarted","Data":"0a0b2a97fb24602961b2f0da24744e0d092d060b30a1a0c9aefb9ae1c0b94e73"} Oct 03 00:40:59 crc kubenswrapper[4832]: I1003 00:40:59.778246 4832 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-rwc5h/must-gather-s5tt4" podStartSLOduration=1.510930985 podStartE2EDuration="5.778229602s" podCreationTimestamp="2025-10-03 00:40:54 +0000 UTC" firstStartedPulling="2025-10-03 00:40:54.76298696 +0000 UTC m=+2048.025142086" lastFinishedPulling="2025-10-03 00:40:59.030285597 +0000 UTC m=+2052.292440703" observedRunningTime="2025-10-03 00:40:59.777016673 +0000 UTC m=+2053.039171779" watchObservedRunningTime="2025-10-03 00:40:59.778229602 +0000 UTC m=+2053.040384708" Oct 03 00:41:08 crc kubenswrapper[4832]: I1003 00:41:08.561272 4832 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/infrawatch-operators-4t4wp"] Oct 03 00:41:08 crc kubenswrapper[4832]: I1003 00:41:08.562951 4832 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/infrawatch-operators-4t4wp" Oct 03 00:41:08 crc kubenswrapper[4832]: I1003 00:41:08.568887 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/infrawatch-operators-4t4wp"] Oct 03 00:41:08 crc kubenswrapper[4832]: I1003 00:41:08.609863 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xfvg\" (UniqueName: \"kubernetes.io/projected/7cb9f702-ddad-4f9d-a00c-f7a31b8fdeb4-kube-api-access-8xfvg\") pod \"infrawatch-operators-4t4wp\" (UID: \"7cb9f702-ddad-4f9d-a00c-f7a31b8fdeb4\") " pod="service-telemetry/infrawatch-operators-4t4wp" Oct 03 00:41:08 crc kubenswrapper[4832]: I1003 00:41:08.711360 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8xfvg\" (UniqueName: \"kubernetes.io/projected/7cb9f702-ddad-4f9d-a00c-f7a31b8fdeb4-kube-api-access-8xfvg\") pod \"infrawatch-operators-4t4wp\" (UID: \"7cb9f702-ddad-4f9d-a00c-f7a31b8fdeb4\") " pod="service-telemetry/infrawatch-operators-4t4wp" Oct 03 00:41:08 crc kubenswrapper[4832]: I1003 00:41:08.729101 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xfvg\" (UniqueName: \"kubernetes.io/projected/7cb9f702-ddad-4f9d-a00c-f7a31b8fdeb4-kube-api-access-8xfvg\") pod \"infrawatch-operators-4t4wp\" (UID: \"7cb9f702-ddad-4f9d-a00c-f7a31b8fdeb4\") " pod="service-telemetry/infrawatch-operators-4t4wp" Oct 03 00:41:08 crc kubenswrapper[4832]: I1003 00:41:08.882030 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/infrawatch-operators-4t4wp" Oct 03 00:41:09 crc kubenswrapper[4832]: I1003 00:41:09.122142 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/infrawatch-operators-4t4wp"] Oct 03 00:41:09 crc kubenswrapper[4832]: I1003 00:41:09.140664 4832 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 03 00:41:09 crc kubenswrapper[4832]: I1003 00:41:09.827399 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/infrawatch-operators-4t4wp" event={"ID":"7cb9f702-ddad-4f9d-a00c-f7a31b8fdeb4","Type":"ContainerStarted","Data":"386b8c18be588c5d123c373fa4d8ebc4da31fcc3d70e6939f5827de83b1c75bd"} Oct 03 00:41:09 crc kubenswrapper[4832]: I1003 00:41:09.827694 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/infrawatch-operators-4t4wp" event={"ID":"7cb9f702-ddad-4f9d-a00c-f7a31b8fdeb4","Type":"ContainerStarted","Data":"03558d78d9bff90a0adfa5614a1934a02460c319fbf11dd2cf7120ccc0244c9c"} Oct 03 00:41:09 crc kubenswrapper[4832]: I1003 00:41:09.841361 4832 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/infrawatch-operators-4t4wp" podStartSLOduration=1.671223301 podStartE2EDuration="1.841344981s" podCreationTimestamp="2025-10-03 00:41:08 +0000 UTC" firstStartedPulling="2025-10-03 00:41:09.140435028 +0000 UTC m=+2062.402590134" lastFinishedPulling="2025-10-03 00:41:09.310556708 +0000 UTC m=+2062.572711814" observedRunningTime="2025-10-03 00:41:09.839796082 +0000 UTC m=+2063.101951198" watchObservedRunningTime="2025-10-03 00:41:09.841344981 +0000 UTC m=+2063.103500087" Oct 03 00:41:17 crc kubenswrapper[4832]: I1003 00:41:17.148167 4832 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-wqnb2"] Oct 03 00:41:17 crc kubenswrapper[4832]: I1003 00:41:17.149956 4832 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wqnb2" Oct 03 00:41:17 crc kubenswrapper[4832]: I1003 00:41:17.160580 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wqnb2"] Oct 03 00:41:17 crc kubenswrapper[4832]: I1003 00:41:17.227641 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0f98fad-43c4-4992-bcc9-9211485960c3-catalog-content\") pod \"redhat-operators-wqnb2\" (UID: \"c0f98fad-43c4-4992-bcc9-9211485960c3\") " pod="openshift-marketplace/redhat-operators-wqnb2" Oct 03 00:41:17 crc kubenswrapper[4832]: I1003 00:41:17.227736 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0f98fad-43c4-4992-bcc9-9211485960c3-utilities\") pod \"redhat-operators-wqnb2\" (UID: \"c0f98fad-43c4-4992-bcc9-9211485960c3\") " pod="openshift-marketplace/redhat-operators-wqnb2" Oct 03 00:41:17 crc kubenswrapper[4832]: I1003 00:41:17.228023 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nktkb\" (UniqueName: \"kubernetes.io/projected/c0f98fad-43c4-4992-bcc9-9211485960c3-kube-api-access-nktkb\") pod \"redhat-operators-wqnb2\" (UID: \"c0f98fad-43c4-4992-bcc9-9211485960c3\") " pod="openshift-marketplace/redhat-operators-wqnb2" Oct 03 00:41:17 crc kubenswrapper[4832]: I1003 00:41:17.329769 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nktkb\" (UniqueName: \"kubernetes.io/projected/c0f98fad-43c4-4992-bcc9-9211485960c3-kube-api-access-nktkb\") pod \"redhat-operators-wqnb2\" (UID: \"c0f98fad-43c4-4992-bcc9-9211485960c3\") " pod="openshift-marketplace/redhat-operators-wqnb2" Oct 03 00:41:17 crc kubenswrapper[4832]: I1003 00:41:17.329902 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0f98fad-43c4-4992-bcc9-9211485960c3-catalog-content\") pod \"redhat-operators-wqnb2\" (UID: \"c0f98fad-43c4-4992-bcc9-9211485960c3\") " pod="openshift-marketplace/redhat-operators-wqnb2" Oct 03 00:41:17 crc kubenswrapper[4832]: I1003 00:41:17.329984 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0f98fad-43c4-4992-bcc9-9211485960c3-utilities\") pod \"redhat-operators-wqnb2\" (UID: \"c0f98fad-43c4-4992-bcc9-9211485960c3\") " pod="openshift-marketplace/redhat-operators-wqnb2" Oct 03 00:41:17 crc kubenswrapper[4832]: I1003 00:41:17.330488 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0f98fad-43c4-4992-bcc9-9211485960c3-utilities\") pod \"redhat-operators-wqnb2\" (UID: \"c0f98fad-43c4-4992-bcc9-9211485960c3\") " pod="openshift-marketplace/redhat-operators-wqnb2" Oct 03 00:41:17 crc kubenswrapper[4832]: I1003 00:41:17.330534 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0f98fad-43c4-4992-bcc9-9211485960c3-catalog-content\") pod \"redhat-operators-wqnb2\" (UID: \"c0f98fad-43c4-4992-bcc9-9211485960c3\") " pod="openshift-marketplace/redhat-operators-wqnb2" Oct 03 00:41:17 crc kubenswrapper[4832]: I1003 00:41:17.372577 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-nktkb\" (UniqueName: \"kubernetes.io/projected/c0f98fad-43c4-4992-bcc9-9211485960c3-kube-api-access-nktkb\") pod \"redhat-operators-wqnb2\" (UID: \"c0f98fad-43c4-4992-bcc9-9211485960c3\") " pod="openshift-marketplace/redhat-operators-wqnb2" Oct 03 00:41:17 crc kubenswrapper[4832]: I1003 00:41:17.475141 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wqnb2" Oct 03 00:41:18 crc kubenswrapper[4832]: I1003 00:41:18.004292 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wqnb2"] Oct 03 00:41:18 crc kubenswrapper[4832]: I1003 00:41:18.882199 4832 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="service-telemetry/infrawatch-operators-4t4wp" Oct 03 00:41:18 crc kubenswrapper[4832]: I1003 00:41:18.882525 4832 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="service-telemetry/infrawatch-operators-4t4wp" Oct 03 00:41:18 crc kubenswrapper[4832]: I1003 00:41:18.902943 4832 generic.go:334] "Generic (PLEG): container finished" podID="c0f98fad-43c4-4992-bcc9-9211485960c3" containerID="571cbf2d6b3e727a4718aeb2097e4bee89d3316d0dda6742cffb090ee704709b" exitCode=0 Oct 03 00:41:18 crc kubenswrapper[4832]: I1003 00:41:18.902987 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wqnb2" event={"ID":"c0f98fad-43c4-4992-bcc9-9211485960c3","Type":"ContainerDied","Data":"571cbf2d6b3e727a4718aeb2097e4bee89d3316d0dda6742cffb090ee704709b"} Oct 03 00:41:18 crc kubenswrapper[4832]: I1003 00:41:18.903011 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wqnb2" event={"ID":"c0f98fad-43c4-4992-bcc9-9211485960c3","Type":"ContainerStarted","Data":"4980585a79c7409164f41dcce4da282d51b7e46358c92e0089f3544eea6315cd"} Oct 03 00:41:18 crc kubenswrapper[4832]: I1003 00:41:18.914098 4832 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="service-telemetry/infrawatch-operators-4t4wp" Oct 03 00:41:18 crc kubenswrapper[4832]: I1003 00:41:18.945635 4832 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="service-telemetry/infrawatch-operators-4t4wp" Oct 03 00:41:20 crc kubenswrapper[4832]: I1003 00:41:20.916523 4832 generic.go:334] "Generic (PLEG): container finished" podID="c0f98fad-43c4-4992-bcc9-9211485960c3" containerID="680ef754280963fe2197853743fbd1538fb497fdc42dec8677f81a324e0ccd22" exitCode=0 Oct 03 00:41:20 crc kubenswrapper[4832]: I1003 00:41:20.916625 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wqnb2" event={"ID":"c0f98fad-43c4-4992-bcc9-9211485960c3","Type":"ContainerDied","Data":"680ef754280963fe2197853743fbd1538fb497fdc42dec8677f81a324e0ccd22"} Oct 03 00:41:21 crc kubenswrapper[4832]: I1003 00:41:21.335980 4832 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/infrawatch-operators-4t4wp"] Oct 03 00:41:21 crc kubenswrapper[4832]: I1003 00:41:21.336243 4832 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/infrawatch-operators-4t4wp" podUID="7cb9f702-ddad-4f9d-a00c-f7a31b8fdeb4" containerName="registry-server" containerID="cri-o://386b8c18be588c5d123c373fa4d8ebc4da31fcc3d70e6939f5827de83b1c75bd" gracePeriod=2 Oct 03 00:41:21 crc kubenswrapper[4832]: I1003 00:41:21.683863 4832 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/infrawatch-operators-4t4wp" Oct 03 00:41:21 crc kubenswrapper[4832]: I1003 00:41:21.811996 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8xfvg\" (UniqueName: \"kubernetes.io/projected/7cb9f702-ddad-4f9d-a00c-f7a31b8fdeb4-kube-api-access-8xfvg\") pod \"7cb9f702-ddad-4f9d-a00c-f7a31b8fdeb4\" (UID: \"7cb9f702-ddad-4f9d-a00c-f7a31b8fdeb4\") " Oct 03 00:41:21 crc kubenswrapper[4832]: I1003 00:41:21.817375 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7cb9f702-ddad-4f9d-a00c-f7a31b8fdeb4-kube-api-access-8xfvg" (OuterVolumeSpecName: "kube-api-access-8xfvg") pod "7cb9f702-ddad-4f9d-a00c-f7a31b8fdeb4" (UID: "7cb9f702-ddad-4f9d-a00c-f7a31b8fdeb4"). InnerVolumeSpecName "kube-api-access-8xfvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 00:41:21 crc kubenswrapper[4832]: I1003 00:41:21.913872 4832 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8xfvg\" (UniqueName: \"kubernetes.io/projected/7cb9f702-ddad-4f9d-a00c-f7a31b8fdeb4-kube-api-access-8xfvg\") on node \"crc\" DevicePath \"\"" Oct 03 00:41:21 crc kubenswrapper[4832]: I1003 00:41:21.926557 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wqnb2" event={"ID":"c0f98fad-43c4-4992-bcc9-9211485960c3","Type":"ContainerStarted","Data":"e29a5c332a11fcf30af1f283465d20b9a083b0cbf637b00827fbf68f15a76ffb"} Oct 03 00:41:21 crc kubenswrapper[4832]: I1003 00:41:21.928204 4832 generic.go:334] "Generic (PLEG): container finished" podID="7cb9f702-ddad-4f9d-a00c-f7a31b8fdeb4" containerID="386b8c18be588c5d123c373fa4d8ebc4da31fcc3d70e6939f5827de83b1c75bd" exitCode=0 Oct 03 00:41:21 crc kubenswrapper[4832]: I1003 00:41:21.928239 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/infrawatch-operators-4t4wp" event={"ID":"7cb9f702-ddad-4f9d-a00c-f7a31b8fdeb4","Type":"ContainerDied","Data":"386b8c18be588c5d123c373fa4d8ebc4da31fcc3d70e6939f5827de83b1c75bd"} Oct 03 00:41:21 crc kubenswrapper[4832]: I1003 00:41:21.928264 4832 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/infrawatch-operators-4t4wp" Oct 03 00:41:21 crc kubenswrapper[4832]: I1003 00:41:21.928317 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/infrawatch-operators-4t4wp" event={"ID":"7cb9f702-ddad-4f9d-a00c-f7a31b8fdeb4","Type":"ContainerDied","Data":"03558d78d9bff90a0adfa5614a1934a02460c319fbf11dd2cf7120ccc0244c9c"} Oct 03 00:41:21 crc kubenswrapper[4832]: I1003 00:41:21.928339 4832 scope.go:117] "RemoveContainer" containerID="386b8c18be588c5d123c373fa4d8ebc4da31fcc3d70e6939f5827de83b1c75bd" Oct 03 00:41:21 crc kubenswrapper[4832]: I1003 00:41:21.950127 4832 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-wqnb2" podStartSLOduration=2.4288293579999998 podStartE2EDuration="4.950113193s" podCreationTimestamp="2025-10-03 00:41:17 +0000 UTC" firstStartedPulling="2025-10-03 00:41:18.904830918 +0000 UTC m=+2072.166986024" lastFinishedPulling="2025-10-03 00:41:21.426114753 +0000 UTC m=+2074.688269859" observedRunningTime="2025-10-03 00:41:21.948384879 +0000 UTC m=+2075.210539985" watchObservedRunningTime="2025-10-03 00:41:21.950113193 +0000 UTC m=+2075.212268299" Oct 03 00:41:21 crc kubenswrapper[4832]: I1003 00:41:21.965921 4832 scope.go:117] "RemoveContainer" containerID="386b8c18be588c5d123c373fa4d8ebc4da31fcc3d70e6939f5827de83b1c75bd" Oct 03 00:41:21 crc kubenswrapper[4832]: E1003 00:41:21.966356 4832 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"386b8c18be588c5d123c373fa4d8ebc4da31fcc3d70e6939f5827de83b1c75bd\": container with ID starting with 386b8c18be588c5d123c373fa4d8ebc4da31fcc3d70e6939f5827de83b1c75bd not found: ID does not exist" containerID="386b8c18be588c5d123c373fa4d8ebc4da31fcc3d70e6939f5827de83b1c75bd" Oct 03 00:41:21 crc kubenswrapper[4832]: I1003 00:41:21.966393 4832 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"386b8c18be588c5d123c373fa4d8ebc4da31fcc3d70e6939f5827de83b1c75bd"} err="failed to get container status \"386b8c18be588c5d123c373fa4d8ebc4da31fcc3d70e6939f5827de83b1c75bd\": rpc error: code = NotFound desc = could not find container \"386b8c18be588c5d123c373fa4d8ebc4da31fcc3d70e6939f5827de83b1c75bd\": container with ID starting with 386b8c18be588c5d123c373fa4d8ebc4da31fcc3d70e6939f5827de83b1c75bd not found: ID does not exist" Oct 03 00:41:21 crc kubenswrapper[4832]: I1003 00:41:21.968888 4832 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/infrawatch-operators-4t4wp"] Oct 03 00:41:21 crc kubenswrapper[4832]: I1003 00:41:21.971277 4832 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/infrawatch-operators-4t4wp"] Oct 03 00:41:23 crc kubenswrapper[4832]: I1003 00:41:23.395995 4832 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7cb9f702-ddad-4f9d-a00c-f7a31b8fdeb4" path="/var/lib/kubelet/pods/7cb9f702-ddad-4f9d-a00c-f7a31b8fdeb4/volumes" Oct 03 00:41:27 crc kubenswrapper[4832]: I1003 00:41:27.476190 4832 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-wqnb2" Oct 03 00:41:27 crc kubenswrapper[4832]: I1003 00:41:27.476673 4832 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-wqnb2" Oct 03 00:41:27 crc kubenswrapper[4832]: I1003 00:41:27.524128 4832 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-operators-wqnb2" Oct 03 00:41:28 crc kubenswrapper[4832]: I1003 00:41:28.007908 4832 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-wqnb2" Oct 03 00:41:28 crc kubenswrapper[4832]: I1003 00:41:28.051145 4832 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wqnb2"] Oct 03 00:41:29 crc kubenswrapper[4832]: I1003 00:41:29.978140 4832 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-wqnb2" podUID="c0f98fad-43c4-4992-bcc9-9211485960c3" containerName="registry-server" containerID="cri-o://e29a5c332a11fcf30af1f283465d20b9a083b0cbf637b00827fbf68f15a76ffb" gracePeriod=2 Oct 03 00:41:30 crc kubenswrapper[4832]: I1003 00:41:30.321451 4832 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wqnb2" Oct 03 00:41:30 crc kubenswrapper[4832]: I1003 00:41:30.424721 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0f98fad-43c4-4992-bcc9-9211485960c3-catalog-content\") pod \"c0f98fad-43c4-4992-bcc9-9211485960c3\" (UID: \"c0f98fad-43c4-4992-bcc9-9211485960c3\") " Oct 03 00:41:30 crc kubenswrapper[4832]: I1003 00:41:30.424812 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0f98fad-43c4-4992-bcc9-9211485960c3-utilities\") pod \"c0f98fad-43c4-4992-bcc9-9211485960c3\" (UID: \"c0f98fad-43c4-4992-bcc9-9211485960c3\") " Oct 03 00:41:30 crc kubenswrapper[4832]: I1003 00:41:30.425256 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c0f98fad-43c4-4992-bcc9-9211485960c3-utilities" (OuterVolumeSpecName: "utilities") pod "c0f98fad-43c4-4992-bcc9-9211485960c3" (UID: "c0f98fad-43c4-4992-bcc9-9211485960c3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 00:41:30 crc kubenswrapper[4832]: I1003 00:41:30.424847 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nktkb\" (UniqueName: \"kubernetes.io/projected/c0f98fad-43c4-4992-bcc9-9211485960c3-kube-api-access-nktkb\") pod \"c0f98fad-43c4-4992-bcc9-9211485960c3\" (UID: \"c0f98fad-43c4-4992-bcc9-9211485960c3\") " Oct 03 00:41:30 crc kubenswrapper[4832]: I1003 00:41:30.425563 4832 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0f98fad-43c4-4992-bcc9-9211485960c3-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 00:41:30 crc kubenswrapper[4832]: I1003 00:41:30.434006 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0f98fad-43c4-4992-bcc9-9211485960c3-kube-api-access-nktkb" (OuterVolumeSpecName: "kube-api-access-nktkb") pod "c0f98fad-43c4-4992-bcc9-9211485960c3" (UID: "c0f98fad-43c4-4992-bcc9-9211485960c3"). InnerVolumeSpecName "kube-api-access-nktkb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 00:41:30 crc kubenswrapper[4832]: I1003 00:41:30.509785 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c0f98fad-43c4-4992-bcc9-9211485960c3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c0f98fad-43c4-4992-bcc9-9211485960c3" (UID: "c0f98fad-43c4-4992-bcc9-9211485960c3"). 
InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 00:41:30 crc kubenswrapper[4832]: I1003 00:41:30.527067 4832 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0f98fad-43c4-4992-bcc9-9211485960c3-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 00:41:30 crc kubenswrapper[4832]: I1003 00:41:30.527107 4832 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nktkb\" (UniqueName: \"kubernetes.io/projected/c0f98fad-43c4-4992-bcc9-9211485960c3-kube-api-access-nktkb\") on node \"crc\" DevicePath \"\"" Oct 03 00:41:30 crc kubenswrapper[4832]: I1003 00:41:30.989043 4832 generic.go:334] "Generic (PLEG): container finished" podID="c0f98fad-43c4-4992-bcc9-9211485960c3" containerID="e29a5c332a11fcf30af1f283465d20b9a083b0cbf637b00827fbf68f15a76ffb" exitCode=0 Oct 03 00:41:30 crc kubenswrapper[4832]: I1003 00:41:30.989083 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wqnb2" event={"ID":"c0f98fad-43c4-4992-bcc9-9211485960c3","Type":"ContainerDied","Data":"e29a5c332a11fcf30af1f283465d20b9a083b0cbf637b00827fbf68f15a76ffb"} Oct 03 00:41:30 crc kubenswrapper[4832]: I1003 00:41:30.989108 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wqnb2" event={"ID":"c0f98fad-43c4-4992-bcc9-9211485960c3","Type":"ContainerDied","Data":"4980585a79c7409164f41dcce4da282d51b7e46358c92e0089f3544eea6315cd"} Oct 03 00:41:30 crc kubenswrapper[4832]: I1003 00:41:30.989125 4832 scope.go:117] "RemoveContainer" containerID="e29a5c332a11fcf30af1f283465d20b9a083b0cbf637b00827fbf68f15a76ffb" Oct 03 00:41:30 crc kubenswrapper[4832]: I1003 00:41:30.989262 4832 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wqnb2" Oct 03 00:41:31 crc kubenswrapper[4832]: I1003 00:41:31.020141 4832 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wqnb2"] Oct 03 00:41:31 crc kubenswrapper[4832]: I1003 00:41:31.026100 4832 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-wqnb2"] Oct 03 00:41:31 crc kubenswrapper[4832]: I1003 00:41:31.029569 4832 scope.go:117] "RemoveContainer" containerID="680ef754280963fe2197853743fbd1538fb497fdc42dec8677f81a324e0ccd22" Oct 03 00:41:31 crc kubenswrapper[4832]: I1003 00:41:31.062511 4832 scope.go:117] "RemoveContainer" containerID="571cbf2d6b3e727a4718aeb2097e4bee89d3316d0dda6742cffb090ee704709b" Oct 03 00:41:31 crc kubenswrapper[4832]: I1003 00:41:31.079655 4832 scope.go:117] "RemoveContainer" containerID="e29a5c332a11fcf30af1f283465d20b9a083b0cbf637b00827fbf68f15a76ffb" Oct 03 00:41:31 crc kubenswrapper[4832]: E1003 00:41:31.080291 4832 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e29a5c332a11fcf30af1f283465d20b9a083b0cbf637b00827fbf68f15a76ffb\": container with ID starting with e29a5c332a11fcf30af1f283465d20b9a083b0cbf637b00827fbf68f15a76ffb not found: ID does not exist" containerID="e29a5c332a11fcf30af1f283465d20b9a083b0cbf637b00827fbf68f15a76ffb" Oct 03 00:41:31 crc kubenswrapper[4832]: I1003 00:41:31.080333 4832 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e29a5c332a11fcf30af1f283465d20b9a083b0cbf637b00827fbf68f15a76ffb"} err="failed to get container status \"e29a5c332a11fcf30af1f283465d20b9a083b0cbf637b00827fbf68f15a76ffb\": rpc error: code = NotFound desc = could not find container \"e29a5c332a11fcf30af1f283465d20b9a083b0cbf637b00827fbf68f15a76ffb\": container with ID starting with e29a5c332a11fcf30af1f283465d20b9a083b0cbf637b00827fbf68f15a76ffb not found: ID does not exist" Oct 03 00:41:31 crc kubenswrapper[4832]: I1003 00:41:31.080361 4832 scope.go:117] "RemoveContainer" containerID="680ef754280963fe2197853743fbd1538fb497fdc42dec8677f81a324e0ccd22" Oct 03 00:41:31 crc kubenswrapper[4832]: E1003 00:41:31.080768 4832 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"680ef754280963fe2197853743fbd1538fb497fdc42dec8677f81a324e0ccd22\": container with ID starting with 680ef754280963fe2197853743fbd1538fb497fdc42dec8677f81a324e0ccd22 not found: ID does not exist" containerID="680ef754280963fe2197853743fbd1538fb497fdc42dec8677f81a324e0ccd22" Oct 03 00:41:31 crc kubenswrapper[4832]: I1003 00:41:31.080828 4832 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"680ef754280963fe2197853743fbd1538fb497fdc42dec8677f81a324e0ccd22"} err="failed to get container status \"680ef754280963fe2197853743fbd1538fb497fdc42dec8677f81a324e0ccd22\": rpc error: code = NotFound desc = could not find container \"680ef754280963fe2197853743fbd1538fb497fdc42dec8677f81a324e0ccd22\": container with ID starting with 680ef754280963fe2197853743fbd1538fb497fdc42dec8677f81a324e0ccd22 not found: ID does not exist" Oct 03 00:41:31 crc kubenswrapper[4832]: I1003 00:41:31.080858 4832 scope.go:117] "RemoveContainer" containerID="571cbf2d6b3e727a4718aeb2097e4bee89d3316d0dda6742cffb090ee704709b" Oct 03 00:41:31 crc kubenswrapper[4832]: E1003 00:41:31.081306 4832 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"571cbf2d6b3e727a4718aeb2097e4bee89d3316d0dda6742cffb090ee704709b\": container with ID starting with 571cbf2d6b3e727a4718aeb2097e4bee89d3316d0dda6742cffb090ee704709b not found: ID does not exist" containerID="571cbf2d6b3e727a4718aeb2097e4bee89d3316d0dda6742cffb090ee704709b" Oct 03 00:41:31 crc kubenswrapper[4832]: I1003 00:41:31.081332 4832 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"571cbf2d6b3e727a4718aeb2097e4bee89d3316d0dda6742cffb090ee704709b"} err="failed to get container status \"571cbf2d6b3e727a4718aeb2097e4bee89d3316d0dda6742cffb090ee704709b\": rpc error: code = NotFound desc = could not find container \"571cbf2d6b3e727a4718aeb2097e4bee89d3316d0dda6742cffb090ee704709b\": container with ID starting with 571cbf2d6b3e727a4718aeb2097e4bee89d3316d0dda6742cffb090ee704709b not found: ID does not exist" Oct 03 00:41:31 crc kubenswrapper[4832]: I1003 00:41:31.396423 4832 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0f98fad-43c4-4992-bcc9-9211485960c3" path="/var/lib/kubelet/pods/c0f98fad-43c4-4992-bcc9-9211485960c3/volumes" Oct 03 00:41:35 crc kubenswrapper[4832]: I1003 00:41:35.348236 4832 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-9kl2v_02e3e27d-2b35-47ed-a82f-686e4ef7dfd8/control-plane-machine-set-operator/0.log" Oct 03 00:41:35 crc kubenswrapper[4832]: I1003 00:41:35.485191 4832 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-v4h7f_bb3f4c9e-e9cb-4188-9a87-8bdefa5e0ee8/kube-rbac-proxy/0.log" Oct 03 00:41:35 crc kubenswrapper[4832]: I1003 00:41:35.501108 4832 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-v4h7f_bb3f4c9e-e9cb-4188-9a87-8bdefa5e0ee8/machine-api-operator/0.log" Oct 03 00:41:45 crc kubenswrapper[4832]: I1003 00:41:45.290282 4832 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-7d4cc89fcb-p6wlh_88ee0f88-75e2-46d8-ba6e-fc853e8cdea4/cert-manager-controller/0.log" Oct 03 00:41:45 crc kubenswrapper[4832]: I1003 00:41:45.438878 4832 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-d969966f-tbgqj_270b08f1-756c-4eb7-b0bc-59cefba57eb1/cert-manager-webhook/0.log" Oct 03 00:41:45 crc kubenswrapper[4832]: I1003 00:41:45.442744 4832 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7d9f95dbf-gd7wc_45a1b7f0-a950-421d-978a-651d666f9399/cert-manager-cainjector/0.log" Oct 03 00:41:56 crc kubenswrapper[4832]: I1003 00:41:56.397290 4832 patch_prober.go:28] interesting pod/machine-config-daemon-xzpk4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 00:41:56 crc kubenswrapper[4832]: I1003 00:41:56.397844 4832 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" podUID="13eeb165-19f7-4e3f-a442-1c9c570d853b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 00:41:58 crc kubenswrapper[4832]: I1003 00:41:58.057900 4832 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69cj7kb_b47ed99b-5898-4dc0-908c-b3edb092437a/util/0.log" Oct 03 00:41:58 crc kubenswrapper[4832]: I1003 00:41:58.206739 4832 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69cj7kb_b47ed99b-5898-4dc0-908c-b3edb092437a/util/0.log" Oct 03 00:41:58 crc kubenswrapper[4832]: I1003 00:41:58.237892 4832 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69cj7kb_b47ed99b-5898-4dc0-908c-b3edb092437a/pull/0.log" Oct 03 00:41:58 crc kubenswrapper[4832]: I1003 00:41:58.273544 4832 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69cj7kb_b47ed99b-5898-4dc0-908c-b3edb092437a/pull/0.log" Oct 03 00:41:58 crc kubenswrapper[4832]: I1003 00:41:58.416049 4832 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69cj7kb_b47ed99b-5898-4dc0-908c-b3edb092437a/util/0.log" Oct 03 00:41:58 crc kubenswrapper[4832]: I1003 00:41:58.427015 4832 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69cj7kb_b47ed99b-5898-4dc0-908c-b3edb092437a/pull/0.log" Oct 03 00:41:58 crc kubenswrapper[4832]: I1003 00:41:58.439980 4832 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69cj7kb_b47ed99b-5898-4dc0-908c-b3edb092437a/extract/0.log" Oct 03 00:41:58 crc kubenswrapper[4832]: I1003 00:41:58.574844 4832 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fx77t9_d1d45f3c-b296-4f03-99c2-dfa01faf7785/util/0.log" Oct 03 00:41:58 crc kubenswrapper[4832]: I1003 00:41:58.709549 4832 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fx77t9_d1d45f3c-b296-4f03-99c2-dfa01faf7785/util/0.log" Oct 03 00:41:58 crc kubenswrapper[4832]: I1003 00:41:58.728146 4832 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fx77t9_d1d45f3c-b296-4f03-99c2-dfa01faf7785/pull/0.log" Oct 03 00:41:58 crc kubenswrapper[4832]: I1003 00:41:58.742963 4832 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fx77t9_d1d45f3c-b296-4f03-99c2-dfa01faf7785/pull/0.log" Oct 03 00:41:58 crc kubenswrapper[4832]: I1003 00:41:58.888961 4832 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fx77t9_d1d45f3c-b296-4f03-99c2-dfa01faf7785/pull/0.log" Oct 03 00:41:58 crc kubenswrapper[4832]: I1003 00:41:58.896965 4832 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fx77t9_d1d45f3c-b296-4f03-99c2-dfa01faf7785/util/0.log" Oct 03 00:41:58 crc kubenswrapper[4832]: I1003 00:41:58.921803 4832 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fx77t9_d1d45f3c-b296-4f03-99c2-dfa01faf7785/extract/0.log" Oct 03 00:41:59 crc kubenswrapper[4832]: I1003 00:41:59.057794 4832 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2djgm5j_70a18e90-8200-46bb-b7e4-cde7e0ff2742/util/0.log" Oct 03 00:41:59 crc kubenswrapper[4832]: I1003 00:41:59.194312 4832 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2djgm5j_70a18e90-8200-46bb-b7e4-cde7e0ff2742/util/0.log" Oct 03 00:41:59 crc kubenswrapper[4832]: I1003 00:41:59.236447 4832 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2djgm5j_70a18e90-8200-46bb-b7e4-cde7e0ff2742/pull/0.log" Oct 03 00:41:59 crc kubenswrapper[4832]: I1003 00:41:59.241126 4832 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2djgm5j_70a18e90-8200-46bb-b7e4-cde7e0ff2742/pull/0.log" Oct 03 00:41:59 crc kubenswrapper[4832]: I1003 00:41:59.373272 4832 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2djgm5j_70a18e90-8200-46bb-b7e4-cde7e0ff2742/pull/0.log" Oct 03 00:41:59 crc kubenswrapper[4832]: I1003 00:41:59.374488 4832 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2djgm5j_70a18e90-8200-46bb-b7e4-cde7e0ff2742/util/0.log" Oct 03 00:41:59 crc kubenswrapper[4832]: I1003 00:41:59.403833 4832 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2djgm5j_70a18e90-8200-46bb-b7e4-cde7e0ff2742/extract/0.log" Oct 03 00:41:59 crc kubenswrapper[4832]: I1003 00:41:59.528088 4832 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-gxnkl_c9594493-a290-4381-8724-3037db3c8c26/extract-utilities/0.log" Oct 03 00:41:59 crc kubenswrapper[4832]: I1003 00:41:59.686413 4832 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-gxnkl_c9594493-a290-4381-8724-3037db3c8c26/extract-utilities/0.log" Oct 03 00:41:59 crc kubenswrapper[4832]: I1003 00:41:59.691658 4832 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-gxnkl_c9594493-a290-4381-8724-3037db3c8c26/extract-content/0.log" Oct 03 00:41:59 crc kubenswrapper[4832]: I1003 00:41:59.694117 4832 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-gxnkl_c9594493-a290-4381-8724-3037db3c8c26/extract-content/0.log" Oct 03 00:41:59 crc kubenswrapper[4832]: I1003 00:41:59.829649 4832 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-gxnkl_c9594493-a290-4381-8724-3037db3c8c26/extract-utilities/0.log" Oct 03 00:41:59 crc kubenswrapper[4832]: I1003 00:41:59.833731 4832 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-gxnkl_c9594493-a290-4381-8724-3037db3c8c26/extract-content/0.log" Oct 03 00:41:59 crc kubenswrapper[4832]: I1003 00:41:59.984836 4832 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-lfrv7_162803e4-f5f3-4b74-bc49-f78ee88caefa/extract-utilities/0.log" Oct 03 00:42:00 crc kubenswrapper[4832]: I1003 00:42:00.099410 4832 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-gxnkl_c9594493-a290-4381-8724-3037db3c8c26/registry-server/0.log" Oct 03 00:42:00 crc kubenswrapper[4832]: I1003 00:42:00.217423 4832 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-lfrv7_162803e4-f5f3-4b74-bc49-f78ee88caefa/extract-content/0.log" Oct 03 00:42:00 crc kubenswrapper[4832]: I1003 00:42:00.227655 4832 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-lfrv7_162803e4-f5f3-4b74-bc49-f78ee88caefa/extract-utilities/0.log" Oct 03 00:42:00 crc kubenswrapper[4832]: I1003 00:42:00.227964 4832 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-lfrv7_162803e4-f5f3-4b74-bc49-f78ee88caefa/extract-content/0.log" Oct 03 00:42:00 crc kubenswrapper[4832]: I1003 00:42:00.390826 4832 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-lfrv7_162803e4-f5f3-4b74-bc49-f78ee88caefa/extract-utilities/0.log" Oct 03 00:42:00 crc kubenswrapper[4832]: I1003 00:42:00.462877 4832 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-lfrv7_162803e4-f5f3-4b74-bc49-f78ee88caefa/extract-content/0.log" Oct 03 00:42:00 crc kubenswrapper[4832]: I1003 00:42:00.601488 4832 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_dec6855936450d9745a6250315030408d50c8ea57f37cefa88be37c927nsvjt_d031b5fc-f2b1-4d17-bbaf-bdee585e907d/util/0.log" Oct 03 00:42:00 crc kubenswrapper[4832]: I1003 00:42:00.707744 4832 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-lfrv7_162803e4-f5f3-4b74-bc49-f78ee88caefa/registry-server/0.log" Oct 03 00:42:00 crc kubenswrapper[4832]: I1003 00:42:00.773967 4832 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_dec6855936450d9745a6250315030408d50c8ea57f37cefa88be37c927nsvjt_d031b5fc-f2b1-4d17-bbaf-bdee585e907d/pull/0.log" Oct 03 00:42:00 crc kubenswrapper[4832]: I1003 00:42:00.781261 4832 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_dec6855936450d9745a6250315030408d50c8ea57f37cefa88be37c927nsvjt_d031b5fc-f2b1-4d17-bbaf-bdee585e907d/pull/0.log" Oct 03 00:42:00 crc kubenswrapper[4832]: I1003 00:42:00.800140 4832 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_dec6855936450d9745a6250315030408d50c8ea57f37cefa88be37c927nsvjt_d031b5fc-f2b1-4d17-bbaf-bdee585e907d/util/0.log" Oct 03 00:42:00 crc kubenswrapper[4832]: I1003 00:42:00.941429 4832 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_dec6855936450d9745a6250315030408d50c8ea57f37cefa88be37c927nsvjt_d031b5fc-f2b1-4d17-bbaf-bdee585e907d/util/0.log" Oct 03 00:42:00 crc kubenswrapper[4832]: I1003 00:42:00.948037 4832 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_dec6855936450d9745a6250315030408d50c8ea57f37cefa88be37c927nsvjt_d031b5fc-f2b1-4d17-bbaf-bdee585e907d/pull/0.log" Oct 03 00:42:01 crc kubenswrapper[4832]: I1003 00:42:01.004663 4832 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_dec6855936450d9745a6250315030408d50c8ea57f37cefa88be37c927nsvjt_d031b5fc-f2b1-4d17-bbaf-bdee585e907d/extract/0.log" Oct 03 00:42:01 crc kubenswrapper[4832]: I1003 00:42:01.050508 4832 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-6jnq4_547c7572-9eeb-4437-a528-2b1cc5eb573b/marketplace-operator/0.log" Oct 03 00:42:01 crc kubenswrapper[4832]: I1003 00:42:01.137112 4832 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-gk9cm_8246614a-4293-4792-9a59-dafd437ccd03/extract-utilities/0.log" Oct 03 00:42:01 crc kubenswrapper[4832]: I1003 00:42:01.260409 4832 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-gk9cm_8246614a-4293-4792-9a59-dafd437ccd03/extract-utilities/0.log" Oct 03 00:42:01 crc kubenswrapper[4832]: I1003 00:42:01.293783 4832 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-gk9cm_8246614a-4293-4792-9a59-dafd437ccd03/extract-content/0.log" Oct 03 00:42:01 crc kubenswrapper[4832]: I1003 00:42:01.320818 4832 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-gk9cm_8246614a-4293-4792-9a59-dafd437ccd03/extract-content/0.log" Oct 03 00:42:01 crc kubenswrapper[4832]: I1003 00:42:01.598632 4832 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-gk9cm_8246614a-4293-4792-9a59-dafd437ccd03/extract-content/0.log" Oct 03 00:42:01 crc kubenswrapper[4832]: I1003 00:42:01.598797 4832 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-gk9cm_8246614a-4293-4792-9a59-dafd437ccd03/extract-utilities/0.log" Oct 03 00:42:01 crc kubenswrapper[4832]: I1003 00:42:01.879584 4832 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-gk9cm_8246614a-4293-4792-9a59-dafd437ccd03/registry-server/0.log" Oct 03 00:42:11 crc kubenswrapper[4832]: I1003 00:42:11.154846 4832 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-7c8cf85677-97fx8_79f70684-a7c1-4ae6-97f5-9cf0d32f472f/prometheus-operator/0.log" Oct 03 00:42:11 crc kubenswrapper[4832]: I1003 00:42:11.258903 4832 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-69f4747886-mg6lg_0e1c4356-58d2-49a9-aca4-bccb8dac4089/prometheus-operator-admission-webhook/0.log" Oct 03 00:42:11 crc kubenswrapper[4832]: I1003 00:42:11.353229 4832 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-69f4747886-tbzpl_d0c894cc-8a02-4c23-b822-e4652aa0fde8/prometheus-operator-admission-webhook/0.log" Oct 03 00:42:11 crc kubenswrapper[4832]: I1003 00:42:11.448935 4832 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-cc5f78dfc-482lq_ac6102f2-3843-451a-bcd2-cbb010cb5466/operator/0.log" Oct 03 00:42:11 crc kubenswrapper[4832]: I1003 00:42:11.548390 4832 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-54bc95c9fb-48qqv_d4268b16-87c1-4af3-aa56-b86e56183f43/perses-operator/0.log" Oct 03 00:42:26 crc kubenswrapper[4832]: I1003 00:42:26.397199 4832 patch_prober.go:28] interesting pod/machine-config-daemon-xzpk4 container/machine-config-daemon namespace/openshift-machine-config-operator: 
Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 00:42:26 crc kubenswrapper[4832]: I1003 00:42:26.397711 4832 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" podUID="13eeb165-19f7-4e3f-a442-1c9c570d853b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 00:42:54 crc kubenswrapper[4832]: I1003 00:42:54.560909 4832 generic.go:334] "Generic (PLEG): container finished" podID="aacfe36a-5532-467d-a4cf-5a022b9337be" containerID="0a0b2a97fb24602961b2f0da24744e0d092d060b30a1a0c9aefb9ae1c0b94e73" exitCode=0 Oct 03 00:42:54 crc kubenswrapper[4832]: I1003 00:42:54.560999 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rwc5h/must-gather-s5tt4" event={"ID":"aacfe36a-5532-467d-a4cf-5a022b9337be","Type":"ContainerDied","Data":"0a0b2a97fb24602961b2f0da24744e0d092d060b30a1a0c9aefb9ae1c0b94e73"} Oct 03 00:42:54 crc kubenswrapper[4832]: I1003 00:42:54.561893 4832 scope.go:117] "RemoveContainer" containerID="0a0b2a97fb24602961b2f0da24744e0d092d060b30a1a0c9aefb9ae1c0b94e73" Oct 03 00:42:54 crc kubenswrapper[4832]: I1003 00:42:54.791128 4832 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-rwc5h_must-gather-s5tt4_aacfe36a-5532-467d-a4cf-5a022b9337be/gather/0.log" Oct 03 00:42:56 crc kubenswrapper[4832]: I1003 00:42:56.396871 4832 patch_prober.go:28] interesting pod/machine-config-daemon-xzpk4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 00:42:56 crc kubenswrapper[4832]: I1003 00:42:56.396935 4832 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" podUID="13eeb165-19f7-4e3f-a442-1c9c570d853b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 00:42:56 crc kubenswrapper[4832]: I1003 00:42:56.396983 4832 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" Oct 03 00:42:56 crc kubenswrapper[4832]: I1003 00:42:56.397685 4832 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2bc2358c16385c66ec41ae19ebbdd194d417b76256418784be10fdb6f0a63e4e"} pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 00:42:56 crc kubenswrapper[4832]: I1003 00:42:56.397742 4832 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" podUID="13eeb165-19f7-4e3f-a442-1c9c570d853b" containerName="machine-config-daemon" containerID="cri-o://2bc2358c16385c66ec41ae19ebbdd194d417b76256418784be10fdb6f0a63e4e" gracePeriod=600 Oct 03 00:42:56 crc kubenswrapper[4832]: E1003 00:42:56.521962 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-xzpk4_openshift-machine-config-operator(13eeb165-19f7-4e3f-a442-1c9c570d853b)\"" pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" podUID="13eeb165-19f7-4e3f-a442-1c9c570d853b" Oct 03 00:42:56 crc kubenswrapper[4832]: I1003 00:42:56.578652 4832 generic.go:334] "Generic (PLEG): container finished" podID="13eeb165-19f7-4e3f-a442-1c9c570d853b" containerID="2bc2358c16385c66ec41ae19ebbdd194d417b76256418784be10fdb6f0a63e4e" exitCode=0 Oct 03 00:42:56 crc kubenswrapper[4832]: I1003 00:42:56.578706 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" event={"ID":"13eeb165-19f7-4e3f-a442-1c9c570d853b","Type":"ContainerDied","Data":"2bc2358c16385c66ec41ae19ebbdd194d417b76256418784be10fdb6f0a63e4e"} Oct 03 00:42:56 crc kubenswrapper[4832]: I1003 00:42:56.578765 4832 scope.go:117] "RemoveContainer" containerID="4afdf1456e895a3ee6a789bc43670d5f515cd55a0ecec2fcf0dfe7cbf97d3896" Oct 03 00:42:56 crc kubenswrapper[4832]: I1003 00:42:56.579238 4832 scope.go:117] "RemoveContainer" containerID="2bc2358c16385c66ec41ae19ebbdd194d417b76256418784be10fdb6f0a63e4e" Oct 03 00:42:56 crc kubenswrapper[4832]: E1003 00:42:56.579433 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xzpk4_openshift-machine-config-operator(13eeb165-19f7-4e3f-a442-1c9c570d853b)\"" pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" podUID="13eeb165-19f7-4e3f-a442-1c9c570d853b" Oct 03 00:43:01 crc kubenswrapper[4832]: I1003 00:43:01.258823 4832 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-rwc5h/must-gather-s5tt4"] Oct 03 00:43:01 crc kubenswrapper[4832]: I1003 00:43:01.259400 4832 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-rwc5h/must-gather-s5tt4" podUID="aacfe36a-5532-467d-a4cf-5a022b9337be" containerName="copy" containerID="cri-o://8fe4eb825e8d6b6ee0a691314ccaf6b663c5a5d52c95b0ca3c414d5f72db9f34" gracePeriod=2 Oct 03 00:43:01 crc kubenswrapper[4832]: I1003 00:43:01.263493 4832 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-rwc5h/must-gather-s5tt4"] Oct 03 00:43:01 crc kubenswrapper[4832]: I1003 00:43:01.619102 4832 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-rwc5h_must-gather-s5tt4_aacfe36a-5532-467d-a4cf-5a022b9337be/copy/0.log" Oct 03 00:43:01 crc kubenswrapper[4832]: I1003 00:43:01.619622 4832 generic.go:334] "Generic (PLEG): container finished" podID="aacfe36a-5532-467d-a4cf-5a022b9337be" containerID="8fe4eb825e8d6b6ee0a691314ccaf6b663c5a5d52c95b0ca3c414d5f72db9f34" exitCode=143 Oct 03 00:43:01 crc kubenswrapper[4832]: I1003 00:43:01.619661 4832 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="000a634ef6980cb3d76eae29a8f5087f4eb7e7e3715825c25b3097f42a787d9f" Oct 03 00:43:01 crc kubenswrapper[4832]: I1003 00:43:01.648429 4832 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-rwc5h_must-gather-s5tt4_aacfe36a-5532-467d-a4cf-5a022b9337be/copy/0.log" Oct 03 00:43:01 crc kubenswrapper[4832]: I1003 00:43:01.648843 4832 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-rwc5h/must-gather-s5tt4" Oct 03 00:43:01 crc kubenswrapper[4832]: I1003 00:43:01.732335 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2pqvj\" (UniqueName: \"kubernetes.io/projected/aacfe36a-5532-467d-a4cf-5a022b9337be-kube-api-access-2pqvj\") pod \"aacfe36a-5532-467d-a4cf-5a022b9337be\" (UID: \"aacfe36a-5532-467d-a4cf-5a022b9337be\") " Oct 03 00:43:01 crc kubenswrapper[4832]: I1003 00:43:01.732409 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/aacfe36a-5532-467d-a4cf-5a022b9337be-must-gather-output\") pod \"aacfe36a-5532-467d-a4cf-5a022b9337be\" (UID: \"aacfe36a-5532-467d-a4cf-5a022b9337be\") " Oct 03 00:43:01 crc kubenswrapper[4832]: I1003 00:43:01.739099 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aacfe36a-5532-467d-a4cf-5a022b9337be-kube-api-access-2pqvj" (OuterVolumeSpecName: "kube-api-access-2pqvj") pod "aacfe36a-5532-467d-a4cf-5a022b9337be" (UID: "aacfe36a-5532-467d-a4cf-5a022b9337be"). InnerVolumeSpecName "kube-api-access-2pqvj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 00:43:01 crc kubenswrapper[4832]: I1003 00:43:01.793587 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aacfe36a-5532-467d-a4cf-5a022b9337be-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "aacfe36a-5532-467d-a4cf-5a022b9337be" (UID: "aacfe36a-5532-467d-a4cf-5a022b9337be"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 00:43:01 crc kubenswrapper[4832]: I1003 00:43:01.834605 4832 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2pqvj\" (UniqueName: \"kubernetes.io/projected/aacfe36a-5532-467d-a4cf-5a022b9337be-kube-api-access-2pqvj\") on node \"crc\" DevicePath \"\"" Oct 03 00:43:01 crc kubenswrapper[4832]: I1003 00:43:01.834651 4832 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/aacfe36a-5532-467d-a4cf-5a022b9337be-must-gather-output\") on node \"crc\" DevicePath \"\"" Oct 03 00:43:02 crc kubenswrapper[4832]: I1003 00:43:02.625057 4832 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-rwc5h/must-gather-s5tt4" Oct 03 00:43:03 crc kubenswrapper[4832]: I1003 00:43:03.396402 4832 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aacfe36a-5532-467d-a4cf-5a022b9337be" path="/var/lib/kubelet/pods/aacfe36a-5532-467d-a4cf-5a022b9337be/volumes" Oct 03 00:43:09 crc kubenswrapper[4832]: I1003 00:43:09.387069 4832 scope.go:117] "RemoveContainer" containerID="2bc2358c16385c66ec41ae19ebbdd194d417b76256418784be10fdb6f0a63e4e" Oct 03 00:43:09 crc kubenswrapper[4832]: E1003 00:43:09.387875 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xzpk4_openshift-machine-config-operator(13eeb165-19f7-4e3f-a442-1c9c570d853b)\"" pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" podUID="13eeb165-19f7-4e3f-a442-1c9c570d853b" Oct 03 00:43:23 crc kubenswrapper[4832]: I1003 00:43:23.386535 4832 scope.go:117] "RemoveContainer" containerID="2bc2358c16385c66ec41ae19ebbdd194d417b76256418784be10fdb6f0a63e4e" Oct 03 00:43:23 crc kubenswrapper[4832]: E1003 00:43:23.387319 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xzpk4_openshift-machine-config-operator(13eeb165-19f7-4e3f-a442-1c9c570d853b)\"" pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" podUID="13eeb165-19f7-4e3f-a442-1c9c570d853b" Oct 03 00:43:36 crc kubenswrapper[4832]: I1003 00:43:36.386236 4832 scope.go:117] "RemoveContainer" containerID="2bc2358c16385c66ec41ae19ebbdd194d417b76256418784be10fdb6f0a63e4e" Oct 03 00:43:36 crc kubenswrapper[4832]: E1003 00:43:36.386788 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xzpk4_openshift-machine-config-operator(13eeb165-19f7-4e3f-a442-1c9c570d853b)\"" pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" podUID="13eeb165-19f7-4e3f-a442-1c9c570d853b" Oct 03 00:43:50 crc kubenswrapper[4832]: I1003 00:43:50.386233 4832 scope.go:117] "RemoveContainer" containerID="2bc2358c16385c66ec41ae19ebbdd194d417b76256418784be10fdb6f0a63e4e" Oct 03 00:43:50 crc kubenswrapper[4832]: E1003 00:43:50.386929 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xzpk4_openshift-machine-config-operator(13eeb165-19f7-4e3f-a442-1c9c570d853b)\"" pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" podUID="13eeb165-19f7-4e3f-a442-1c9c570d853b" Oct 03 00:44:01 crc kubenswrapper[4832]: I1003 00:44:01.386327 4832 scope.go:117] "RemoveContainer" containerID="2bc2358c16385c66ec41ae19ebbdd194d417b76256418784be10fdb6f0a63e4e" Oct 03 00:44:01 crc kubenswrapper[4832]: E1003 00:44:01.387120 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-xzpk4_openshift-machine-config-operator(13eeb165-19f7-4e3f-a442-1c9c570d853b)\"" pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" podUID="13eeb165-19f7-4e3f-a442-1c9c570d853b" Oct 03 00:44:16 crc kubenswrapper[4832]: I1003 00:44:16.387163 4832 scope.go:117] "RemoveContainer" containerID="2bc2358c16385c66ec41ae19ebbdd194d417b76256418784be10fdb6f0a63e4e" Oct 03 00:44:16 crc kubenswrapper[4832]: E1003 00:44:16.389079 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xzpk4_openshift-machine-config-operator(13eeb165-19f7-4e3f-a442-1c9c570d853b)\"" pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" podUID="13eeb165-19f7-4e3f-a442-1c9c570d853b" Oct 03 00:44:29 crc kubenswrapper[4832]: I1003 00:44:29.387480 4832 scope.go:117] "RemoveContainer" containerID="2bc2358c16385c66ec41ae19ebbdd194d417b76256418784be10fdb6f0a63e4e" Oct 03 00:44:29 crc kubenswrapper[4832]: E1003 00:44:29.389167 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xzpk4_openshift-machine-config-operator(13eeb165-19f7-4e3f-a442-1c9c570d853b)\"" pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" podUID="13eeb165-19f7-4e3f-a442-1c9c570d853b" Oct 03 00:44:43 crc kubenswrapper[4832]: I1003 00:44:43.386025 4832 scope.go:117] "RemoveContainer" containerID="2bc2358c16385c66ec41ae19ebbdd194d417b76256418784be10fdb6f0a63e4e" Oct 03 00:44:43 crc kubenswrapper[4832]: E1003 00:44:43.386701 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xzpk4_openshift-machine-config-operator(13eeb165-19f7-4e3f-a442-1c9c570d853b)\"" pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" podUID="13eeb165-19f7-4e3f-a442-1c9c570d853b" Oct 03 00:44:56 crc kubenswrapper[4832]: I1003 00:44:56.387204 4832 scope.go:117] "RemoveContainer" containerID="2bc2358c16385c66ec41ae19ebbdd194d417b76256418784be10fdb6f0a63e4e" Oct 03 00:44:56 crc kubenswrapper[4832]: E1003 00:44:56.387923 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xzpk4_openshift-machine-config-operator(13eeb165-19f7-4e3f-a442-1c9c570d853b)\"" pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" podUID="13eeb165-19f7-4e3f-a442-1c9c570d853b" Oct 03 00:45:00 crc kubenswrapper[4832]: I1003 00:45:00.154807 4832 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324205-rx944"] Oct 03 00:45:00 crc kubenswrapper[4832]: E1003 00:45:00.155483 4832 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aacfe36a-5532-467d-a4cf-5a022b9337be" containerName="gather" Oct 03 00:45:00 crc kubenswrapper[4832]: I1003 00:45:00.155508 4832 state_mem.go:107] "Deleted CPUSet assignment" podUID="aacfe36a-5532-467d-a4cf-5a022b9337be" containerName="gather" Oct 03 00:45:00 crc kubenswrapper[4832]: E1003 00:45:00.155530 4832 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0f98fad-43c4-4992-bcc9-9211485960c3" containerName="registry-server" Oct 03 00:45:00 crc kubenswrapper[4832]: I1003 00:45:00.155540 4832 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0f98fad-43c4-4992-bcc9-9211485960c3" containerName="registry-server" Oct 03 00:45:00 crc kubenswrapper[4832]: E1003 00:45:00.155565 4832 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aacfe36a-5532-467d-a4cf-5a022b9337be" containerName="copy" Oct 03 00:45:00 crc kubenswrapper[4832]: I1003 00:45:00.155577 4832 state_mem.go:107] "Deleted CPUSet assignment" podUID="aacfe36a-5532-467d-a4cf-5a022b9337be" containerName="copy" Oct 03 00:45:00 crc kubenswrapper[4832]: E1003 00:45:00.155597 4832 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cb9f702-ddad-4f9d-a00c-f7a31b8fdeb4" containerName="registry-server" Oct 03 00:45:00 crc kubenswrapper[4832]: I1003 00:45:00.155607 4832 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cb9f702-ddad-4f9d-a00c-f7a31b8fdeb4" containerName="registry-server" Oct 03 00:45:00 crc kubenswrapper[4832]: E1003 00:45:00.155625 4832 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0f98fad-43c4-4992-bcc9-9211485960c3" containerName="extract-content" Oct 03 00:45:00 crc kubenswrapper[4832]: I1003 00:45:00.155632 4832 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0f98fad-43c4-4992-bcc9-9211485960c3" containerName="extract-content" Oct 03 00:45:00 crc kubenswrapper[4832]: E1003 00:45:00.155643 4832 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0f98fad-43c4-4992-bcc9-9211485960c3" containerName="extract-utilities" Oct 03 00:45:00 crc kubenswrapper[4832]: I1003 00:45:00.155652 4832 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0f98fad-43c4-4992-bcc9-9211485960c3" containerName="extract-utilities" Oct 03 00:45:00 crc kubenswrapper[4832]: I1003 00:45:00.155886 4832 memory_manager.go:354] "RemoveStaleState removing state" podUID="7cb9f702-ddad-4f9d-a00c-f7a31b8fdeb4" containerName="registry-server" Oct 03 00:45:00 crc kubenswrapper[4832]: I1003 00:45:00.155906 4832 memory_manager.go:354] "RemoveStaleState removing state" podUID="aacfe36a-5532-467d-a4cf-5a022b9337be" containerName="gather" Oct 03 00:45:00 crc kubenswrapper[4832]: I1003 00:45:00.155920 4832 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0f98fad-43c4-4992-bcc9-9211485960c3" containerName="registry-server" Oct 03 00:45:00 crc kubenswrapper[4832]: I1003 00:45:00.155933 4832 memory_manager.go:354] "RemoveStaleState removing state" podUID="aacfe36a-5532-467d-a4cf-5a022b9337be" containerName="copy" Oct 03 00:45:00 crc kubenswrapper[4832]: I1003 00:45:00.156594 4832 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324205-rx944" Oct 03 00:45:00 crc kubenswrapper[4832]: I1003 00:45:00.158061 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/97f5d00f-5e8c-4477-9194-6a32a2c238e9-config-volume\") pod \"collect-profiles-29324205-rx944\" (UID: \"97f5d00f-5e8c-4477-9194-6a32a2c238e9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324205-rx944" Oct 03 00:45:00 crc kubenswrapper[4832]: I1003 00:45:00.158384 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zg4rp\" (UniqueName: \"kubernetes.io/projected/97f5d00f-5e8c-4477-9194-6a32a2c238e9-kube-api-access-zg4rp\") pod \"collect-profiles-29324205-rx944\" (UID: \"97f5d00f-5e8c-4477-9194-6a32a2c238e9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324205-rx944" Oct 03 00:45:00 crc kubenswrapper[4832]: I1003 00:45:00.158538 4832 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/97f5d00f-5e8c-4477-9194-6a32a2c238e9-secret-volume\") pod \"collect-profiles-29324205-rx944\" (UID: \"97f5d00f-5e8c-4477-9194-6a32a2c238e9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324205-rx944" Oct 03 00:45:00 crc kubenswrapper[4832]: I1003 00:45:00.158901 4832 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 03 00:45:00 crc kubenswrapper[4832]: I1003 00:45:00.159662 4832 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 03 00:45:00 crc kubenswrapper[4832]: I1003 00:45:00.167190 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324205-rx944"] Oct 03 00:45:00 crc kubenswrapper[4832]: I1003 00:45:00.260019 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zg4rp\" (UniqueName: \"kubernetes.io/projected/97f5d00f-5e8c-4477-9194-6a32a2c238e9-kube-api-access-zg4rp\") pod \"collect-profiles-29324205-rx944\" (UID: \"97f5d00f-5e8c-4477-9194-6a32a2c238e9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324205-rx944" Oct 03 00:45:00 crc kubenswrapper[4832]: I1003 00:45:00.260313 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/97f5d00f-5e8c-4477-9194-6a32a2c238e9-secret-volume\") pod \"collect-profiles-29324205-rx944\" (UID: \"97f5d00f-5e8c-4477-9194-6a32a2c238e9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324205-rx944" Oct 03 00:45:00 crc kubenswrapper[4832]: I1003 00:45:00.260421 4832 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/97f5d00f-5e8c-4477-9194-6a32a2c238e9-config-volume\") pod \"collect-profiles-29324205-rx944\" (UID: \"97f5d00f-5e8c-4477-9194-6a32a2c238e9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324205-rx944" Oct 03 00:45:00 crc kubenswrapper[4832]: I1003 00:45:00.263602 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/97f5d00f-5e8c-4477-9194-6a32a2c238e9-config-volume\") pod 
\"collect-profiles-29324205-rx944\" (UID: \"97f5d00f-5e8c-4477-9194-6a32a2c238e9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324205-rx944" Oct 03 00:45:00 crc kubenswrapper[4832]: I1003 00:45:00.270588 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/97f5d00f-5e8c-4477-9194-6a32a2c238e9-secret-volume\") pod \"collect-profiles-29324205-rx944\" (UID: \"97f5d00f-5e8c-4477-9194-6a32a2c238e9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324205-rx944" Oct 03 00:45:00 crc kubenswrapper[4832]: I1003 00:45:00.276865 4832 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zg4rp\" (UniqueName: \"kubernetes.io/projected/97f5d00f-5e8c-4477-9194-6a32a2c238e9-kube-api-access-zg4rp\") pod \"collect-profiles-29324205-rx944\" (UID: \"97f5d00f-5e8c-4477-9194-6a32a2c238e9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324205-rx944" Oct 03 00:45:00 crc kubenswrapper[4832]: I1003 00:45:00.483645 4832 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324205-rx944" Oct 03 00:45:00 crc kubenswrapper[4832]: I1003 00:45:00.721876 4832 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324205-rx944"] Oct 03 00:45:01 crc kubenswrapper[4832]: I1003 00:45:01.501550 4832 generic.go:334] "Generic (PLEG): container finished" podID="97f5d00f-5e8c-4477-9194-6a32a2c238e9" containerID="db7fdaac4d9515f5eabdacd844d727998349cd24fad6d4664d6cf6bfb2a6fe33" exitCode=0 Oct 03 00:45:01 crc kubenswrapper[4832]: I1003 00:45:01.501805 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324205-rx944" event={"ID":"97f5d00f-5e8c-4477-9194-6a32a2c238e9","Type":"ContainerDied","Data":"db7fdaac4d9515f5eabdacd844d727998349cd24fad6d4664d6cf6bfb2a6fe33"} Oct 03 00:45:01 crc kubenswrapper[4832]: I1003 00:45:01.501830 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324205-rx944" event={"ID":"97f5d00f-5e8c-4477-9194-6a32a2c238e9","Type":"ContainerStarted","Data":"fef7f04649abcb7cfee3c965fa1e82d59c8df08c3ede335bb9403c21e589cbaf"} Oct 03 00:45:02 crc kubenswrapper[4832]: I1003 00:45:02.776321 4832 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324205-rx944" Oct 03 00:45:02 crc kubenswrapper[4832]: I1003 00:45:02.794703 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zg4rp\" (UniqueName: \"kubernetes.io/projected/97f5d00f-5e8c-4477-9194-6a32a2c238e9-kube-api-access-zg4rp\") pod \"97f5d00f-5e8c-4477-9194-6a32a2c238e9\" (UID: \"97f5d00f-5e8c-4477-9194-6a32a2c238e9\") " Oct 03 00:45:02 crc kubenswrapper[4832]: I1003 00:45:02.796810 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/97f5d00f-5e8c-4477-9194-6a32a2c238e9-secret-volume\") pod \"97f5d00f-5e8c-4477-9194-6a32a2c238e9\" (UID: \"97f5d00f-5e8c-4477-9194-6a32a2c238e9\") " Oct 03 00:45:02 crc kubenswrapper[4832]: I1003 00:45:02.796927 4832 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/97f5d00f-5e8c-4477-9194-6a32a2c238e9-config-volume\") pod \"97f5d00f-5e8c-4477-9194-6a32a2c238e9\" (UID: \"97f5d00f-5e8c-4477-9194-6a32a2c238e9\") " Oct 03 00:45:02 crc kubenswrapper[4832]: I1003 00:45:02.797804 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97f5d00f-5e8c-4477-9194-6a32a2c238e9-config-volume" (OuterVolumeSpecName: "config-volume") pod "97f5d00f-5e8c-4477-9194-6a32a2c238e9" (UID: "97f5d00f-5e8c-4477-9194-6a32a2c238e9"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 00:45:02 crc kubenswrapper[4832]: I1003 00:45:02.799088 4832 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/97f5d00f-5e8c-4477-9194-6a32a2c238e9-config-volume\") on node \"crc\" DevicePath \"\"" Oct 03 00:45:02 crc kubenswrapper[4832]: I1003 00:45:02.804512 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97f5d00f-5e8c-4477-9194-6a32a2c238e9-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "97f5d00f-5e8c-4477-9194-6a32a2c238e9" (UID: "97f5d00f-5e8c-4477-9194-6a32a2c238e9"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 00:45:02 crc kubenswrapper[4832]: I1003 00:45:02.815506 4832 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97f5d00f-5e8c-4477-9194-6a32a2c238e9-kube-api-access-zg4rp" (OuterVolumeSpecName: "kube-api-access-zg4rp") pod "97f5d00f-5e8c-4477-9194-6a32a2c238e9" (UID: "97f5d00f-5e8c-4477-9194-6a32a2c238e9"). InnerVolumeSpecName "kube-api-access-zg4rp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 00:45:02 crc kubenswrapper[4832]: I1003 00:45:02.901137 4832 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zg4rp\" (UniqueName: \"kubernetes.io/projected/97f5d00f-5e8c-4477-9194-6a32a2c238e9-kube-api-access-zg4rp\") on node \"crc\" DevicePath \"\"" Oct 03 00:45:02 crc kubenswrapper[4832]: I1003 00:45:02.901189 4832 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/97f5d00f-5e8c-4477-9194-6a32a2c238e9-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 03 00:45:03 crc kubenswrapper[4832]: I1003 00:45:03.515226 4832 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324205-rx944" event={"ID":"97f5d00f-5e8c-4477-9194-6a32a2c238e9","Type":"ContainerDied","Data":"fef7f04649abcb7cfee3c965fa1e82d59c8df08c3ede335bb9403c21e589cbaf"} Oct 03 00:45:03 crc kubenswrapper[4832]: I1003 00:45:03.515267 4832 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fef7f04649abcb7cfee3c965fa1e82d59c8df08c3ede335bb9403c21e589cbaf" Oct 03 00:45:03 crc kubenswrapper[4832]: I1003 00:45:03.515297 4832 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324205-rx944" Oct 03 00:45:03 crc kubenswrapper[4832]: I1003 00:45:03.838460 4832 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324160-nmbdb"] Oct 03 00:45:03 crc kubenswrapper[4832]: I1003 00:45:03.842886 4832 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324160-nmbdb"] Oct 03 00:45:05 crc kubenswrapper[4832]: I1003 00:45:05.411100 4832 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e78dfd8f-d109-412c-afcc-2e9fe3786bd3" path="/var/lib/kubelet/pods/e78dfd8f-d109-412c-afcc-2e9fe3786bd3/volumes" Oct 03 00:45:07 crc kubenswrapper[4832]: I1003 00:45:07.402239 4832 scope.go:117] "RemoveContainer" containerID="2bc2358c16385c66ec41ae19ebbdd194d417b76256418784be10fdb6f0a63e4e" Oct 03 00:45:07 crc kubenswrapper[4832]: E1003 00:45:07.402617 4832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xzpk4_openshift-machine-config-operator(13eeb165-19f7-4e3f-a442-1c9c570d853b)\"" pod="openshift-machine-config-operator/machine-config-daemon-xzpk4" podUID="13eeb165-19f7-4e3f-a442-1c9c570d853b"